What happened
On 29 October 2025, the Swiss Federal Council opened a consultation on a new law for very large communication platforms and search engines. The goal is to strengthen user rights, increase fairness and transparency, and set baseline rules for services such as Facebook, X, TikTok and Google. The consultation is open until 16 February 2026.
This is not a “digital policy only” topic. Even if your company is not a platform, the draft could reshape how:
- advertising and targeting are documented,
- content removals and account blocks are challenged,
- illegal content is reported and handled, and
- large services provide access to data for authorities and researchers.
Who the draft law targets
The draft is limited to very large services because of their outsized impact on public debate. The Federal Council describes "very large" as services used, on average, by 10% of the permanent Swiss resident population per month (currently around 900,000 users).
So, for most SMEs this is not a direct compliance obligation. But it can still affect you in three practical ways:
- As an advertiser (targeting, ad disclosures, ad archives).
- As a brand (takedowns, account restrictions, dispute channels).
- As a tech business (if you operate a product that could scale into the "very large" category, or if you run a marketplace/community with platform-like features).
Key obligations proposed in the draft (plain English)
1) Easier reporting of suspected illegal content
The Federal Council indicates the law would require large platforms to provide a procedure that lets users easily report specific items of suspected unlawful content, such as depictions of violence and threatening language. The press release lists examples of reportable offences (e.g., defamation, insult, and incitement to hatred under Swiss criminal law).
Why businesses should care:
- If your brand is targeted by harmful content, a clearer reporting path can reduce time-to-removal.
- If your team manages communities (comments, user-generated content, reviews), you should understand how "reporting" triggers and evidence expectations may evolve.
2) Stronger “explain yourself” duties for removals and blocks
The draft framework would require platforms to inform users when removing content or blocking accounts and to explain the grounds for the decision.
Why businesses should care:
- Many companies depend on social accounts for sales and customer support. Sudden blocks can mean lost revenue and reputational damage.
- If the rules become more structured, brands may have clearer escalation paths and better documentation for disputes.
3) Internal complaints and out-of-court dispute resolution
The Federal Council states that platforms would need to establish an internal complaint procedure and participate in out-of-court dispute resolution when conflicts arise.
Why businesses should care:
- This could reduce "dead-end support tickets" and create a more predictable dispute workflow.
- For brands, a consistent appeal route is often more valuable than "support luck."
4) Advertising transparency, targeting disclosure, recommendation systems
The press release highlights transparency requirements related to advertising declaration and targeting and the use of recommendation systems. It also mentions a publicly accessible advertising archive and access to data for authorities and researchers.
Why businesses should care:
- Marketing teams may need cleaner documentation of targeting logic and ad creative history.
- Regulated industries (finance, health, crypto, legal services) will likely want stronger review controls and clearer "what we ran, where, and why" records.
5) Swiss legal representative for foreign-based providers
Very large platforms and search engines would be required to designate a legal representative in Switzerland if their registered office is abroad. The Federal Council frames this as strengthening enforcement against providers without offices in Switzerland.
Why businesses should care:
- Enforcement becomes more practical when authorities and courts have a local point of contact.
- In disputes (including brand protection), having a local legal representative can make escalation less opaque.
What companies should do now (practical preparation)
Because this is still a consultation, the most sensible approach is “prepare your operating model”, not “panic compliance.”
1) If your business depends heavily on paid ads
Create a simple “ad compliance file” that you can maintain going forward:
- who approves ads (name/role),
- what the ad claims are based on (proof files),
- what audiences were targeted (high-level categories),
- which campaigns ran and when (export from ad accounts).
This improves resilience even if the final law changes.
2) If your brand risk is reputation/content-driven
Put in place an internal playbook:
- what counts as "reportable harmful content" for your brand,
- who collects evidence (screenshots/URLs/timestamps),
- who triggers escalation,
- who communicates with the platform.
This reduces reaction time during a crisis.
3) If you run a platform-like service or community
Even if you are far from the “very large” threshold, borrow the logic:
- create clear content rules,
- define a takedown/appeal procedure,
- keep decision logs.
This is not bureaucracy; it is "future-proofing" for growth, investor due diligence, and better user trust.
4) If you operate cross-border
Align internal policies with the reality that your product may be subject to multiple regulatory regimes (EU DSA-like expectations, local enforcement, advertising transparency, data protection). The draft explicitly targets the biggest services, but regulatory pressure tends to cascade down the ecosystem over time.
What to watch next
- The consultation deadline is 16 February 2026.
- The Federal Council also flags questions for consultation participants about protection of minors and the notification procedure, so those sections may evolve in the final proposal.
How Yudey can help
This topic typically turns into practical legal work in four areas:
- Contracts & policies (platform terms, influencer/affiliate agreements, ad claim substantiation, internal approval procedures).
- Data protection & compliance (risk mapping, documentation, incident playbooks).
- Dispute strategy (account blocks, takedowns, reputational attacks, evidence handling).
- Corporate readiness for tech businesses scaling in Switzerland (governance, recordkeeping, risk controls).
If you describe your business model (advertiser / e-commerce / SaaS / platform / media), we can propose a short checklist tailored to your exact risk profile.