
What are Online Platforms?
Online platforms occupy the regulatory space between basic hosting services and very large online platforms (VLOPs). These services store user content and make it publicly accessible, which creates specific regulatory responsibilities. Social media networks, online marketplaces, app stores, and content-sharing services all fall within this category.
The key distinction is active facilitation of public communication and commerce, rather than the simple information warehousing that basic hosting providers perform. Platform users publish content that reaches broader audiences through platform interfaces, which creates distinct content management responsibilities.
This designation triggers enhanced obligations because platforms shape online discourse and commercial activity across Europe. These services become spaces where people discover products, share opinions, and build communities.
DSA Obligations for Online Platforms
Here’s the reality: if you operate an online platform, the DSA fundamentally changes how you do business. These aren’t just compliance boxes to tick; they’re operational requirements that touch every aspect of how your platform works. The full weight of these obligations landed on 17 February 2024, and there’s no going back.

▶ Content Moderation and User Protection
Platforms need complaint-handling systems that function effectively rather than contact forms that disappear into digital voids. Users who disagree with content decisions must have access to free, accessible appeal mechanisms that provide genuine review.
Content removal or account suspension requires specific, actionable explanations rather than generic boilerplate responses. Users deserve clear information about policy violations and resolution procedures.
Trusted flagger programmes ensure that notices submitted by specifically designated entities are handled with priority. Organisations with expertise in identifying illegal content that act independently of any online platform provider may apply to their national Digital Services Coordinator for this status.
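The priority handling described above can be pictured as a notice queue that processes trusted-flagger reports ahead of ordinary user reports. This is a minimal sketch, not a mandated design; all class and field names are illustrative assumptions.

```python
import heapq
from dataclasses import dataclass
from itertools import count

# Illustrative priorities: lower number = processed sooner.
TRUSTED_PRIORITY = 0
STANDARD_PRIORITY = 1

@dataclass
class Notice:
    reporter_id: str
    content_url: str
    reason: str
    trusted_flagger: bool = False  # set True for designated entities

class NoticeQueue:
    """Sketch of a moderation queue that prioritises trusted flaggers."""

    def __init__(self):
        self._heap = []
        self._counter = count()  # tie-breaker preserves arrival order

    def submit(self, notice: Notice) -> None:
        priority = TRUSTED_PRIORITY if notice.trusted_flagger else STANDARD_PRIORITY
        heapq.heappush(self._heap, (priority, next(self._counter), notice))

    def next_notice(self) -> Notice:
        _, _, notice = heapq.heappop(self._heap)
        return notice
```

Within each priority tier, the counter keeps notices in first-come, first-served order, so trusted-flagger notices jump the queue without reordering everything else.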
▶ Advertising Transparency
Every platform advertisement requires crystal-clear identification without blurred lines between organic content and paid promotion.
- ▶ Real-time Advertisement Identification: Users should never question whether they’re viewing advertisements. Visual markers must be prominent and unmistakable.
- ▶ Advertiser Identity Disclosure: Advertisement payment sources and promotional subjects must be immediately accessible from advertisements themselves.
- ▶ Targeting Parameter Transparency: Platforms must provide meaningful information about the targeting criteria used for each advertisement shown to users. Users also need to know what options exist to modify those parameters; controls for adjusting advertising preferences must be readily accessible.
- ▶ Prohibited Advertising Practices: Platforms cannot use GDPR special category data for advertisement targeting. Sexual orientation, religious beliefs, ethnic origin, and health information remain off-limits for advertising purposes regardless of user consent.
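The requirements above can be sketched as a per-advertisement disclosure record with a compliance check. The DSA does not prescribe a data model, so the field names, categories, and validation logic below are assumptions for illustration only.

```python
from dataclasses import dataclass

# Hypothetical labels for GDPR special-category data barred from ad targeting.
PROHIBITED_TARGETING = {"sexual_orientation", "religious_beliefs",
                        "ethnic_origin", "health"}

@dataclass
class AdDisclosure:
    is_labelled_as_ad: bool          # prominent, unmistakable ad marker shown
    advertiser_name: str             # who paid for the advertisement
    promoted_on_behalf_of: str       # whose product or message is promoted
    targeting_parameters: list[str]  # criteria used to select this user

def validate_ad(ad: AdDisclosure) -> list[str]:
    """Return a list of compliance problems (empty list = none found)."""
    problems = []
    if not ad.is_labelled_as_ad:
        problems.append("advertisement not labelled as such")
    if not ad.advertiser_name:
        problems.append("advertiser identity missing")
    banned = PROHIBITED_TARGETING.intersection(ad.targeting_parameters)
    if banned:
        problems.append(f"special-category targeting used: {sorted(banned)}")
    return problems
```

Note that the special-category check does not look for a consent flag at all: under the DSA, user consent does not make such targeting permissible.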

▶ Recommender System Disclosure
Platform algorithms can no longer remain entirely proprietary. Users are entitled to detailed information about how recommendation systems determine what content appears in their feeds.
This includes ranking factors, content promotion criteria, and user options for influencing personalised recommendations.
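A disclosure covering those elements might be structured as below. The DSA requires the main parameters to be explained in plain terms but does not mandate a format, so the keys and wording here are illustrative assumptions.

```python
# Hypothetical recommender-system disclosure payload.
recommender_disclosure = {
    "main_parameters": [
        {"factor": "engagement_history",
         "description": "Your past interactions with similar content"},
        {"factor": "recency",
         "description": "How recently the content was published"},
        {"factor": "followed_accounts",
         "description": "Content from accounts you follow"},
    ],
    "user_options": [
        "switch to a non-personalised, chronological feed",
        "reset the interest signals used for personalisation",
    ],
}
```

Keeping the disclosure as structured data makes it straightforward to render the same information in terms of service, a settings screen, and a transparency report without the copies drifting apart.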
▶ Enhanced Transparency Reporting
Annual transparency reports must include detailed content moderation statistics, user complaint volumes and resolution times, and automated detection system effectiveness metrics.
Effective reports tell improvement stories rather than simply documenting completed activities.
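Those report figures have to come from somewhere, so it helps to aggregate complaint data continuously rather than reconstruct it annually. The sketch below assumes a simple record format; the input fields and metric names are illustrative, not prescribed by the DSA.

```python
from statistics import median

def summarise_complaints(complaints: list[dict]) -> dict:
    """Aggregate complaint records into transparency-report metrics.

    Each record is assumed to carry 'resolved' (bool) and
    'resolution_days' (number, or None if still open).
    """
    resolved = [c for c in complaints if c["resolved"]]
    times = [c["resolution_days"] for c in resolved]
    return {
        "complaints_received": len(complaints),
        "complaints_resolved": len(resolved),
        "median_resolution_days": median(times) if times else None,
    }
```

The median is used rather than the mean so a handful of long-running escalations does not distort the headline resolution-time figure.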
