New Digital Services Act: Scope & Obligations
Starting from February 17, 2024, the directly applicable Digital Services Act[1] (DSA) applies to all digital platforms. The DSA sets out rules on the liability of digital service providers for content posted by online users and lays down further conditions for content moderation.
If you are a digital service intermediary, the new DSA rules will most likely affect you as well. Not sure whether you are an obligated entity? Check yourself against the criteria below.
DSA rules apply to enterprises providing digital services as so-called intermediaries. Intermediaries are persons who connect users with third-party goods, services, and content. A condition for the application of the DSA is that the intermediary handles third-party content, services, or goods, not its own. If an entrepreneur provides only their own content, the DSA does not apply to them, and the content provider remains responsible for that content.[2] Some digital services, or parts of them, offered by a provider may fall within the scope of the DSA, while others may not. In the case of such a combined offering by a single provider, the DSA applies only to those services, or parts of services, that fall within its scope.[3]
Intermediary services specifically include the following activities:
- mere conduit services - operators of public WiFi networks, internet access providers, VoIP (internet telephony) providers, domain registrars, domain authorities, and certification authorities issuing digital certificates;
- hosting services - cloud computing, web hosting, content-sharing platforms, clouds, reference services,
- including online platforms (hosting services that store and publicly disseminate information at the user's request) - online marketplaces, app stores, social networks (including very large online platforms), travel and accommodation platforms, and sharing-economy platforms;
- caching services - services ensuring the smooth and efficient transmission of information on the internet, such as content delivery networks and reverse proxy servers.
We are an e-shop - does the DSA apply to us? If you allow any user-generated content on your e-shop, e.g., customer reviews, the DSA applies.
We are a business providing digital services, but we are not based in the EU. The DSA applies regardless of where the business is established or located: if you offer your services within the EU, the DSA applies.
What should digital service providers prepare for to be compliant with the DSA?
Specific rules and obligations are set proportionately to the nature and size of the particular service. If your services have only a minor reach, only the basic obligations apply to you; larger services face more extensive duties. Additionally, if you are a small or micro-enterprise, you are directly exempt from some obligations.
All providers of intermediary services should particularly address the following obligations:
- update contractual terms to include the new mandatory details and information (for example, the procedure for content moderation, i.e., restricting user content, algorithmic decision-making, restrictions on the provision of the service, termination of use, etc.),
- publish annual reports on the content moderation performed over the preceding period; the first report should be published by February 17, 2025,
- designate a single point of contact.
Hosting services, including online platforms, have the following obligations in addition to those above:
- introduce mechanisms allowing users to flag and report illegal online content - in a way that is simple and clearly visible,
- process submitted notices transparently and without discrimination,
- provide users with reasons for any restrictions imposed on uploaded content or on the use of the service, including account termination,
- inform public authorities in case of suspicion of a criminal offense endangering the life or safety of persons.
Online platforms, beyond the aforementioned, also have these obligations:
- not design online interfaces in a way that constitutes so-called dark patterns, i.e., not use practices that prevent users from making free and informed decisions[4] - for example, visually highlighting one option over others or making it difficult to cancel a user account,
- maintain transparency in advertising and inform users why they are shown behavioral (targeted) advertising or that content is sponsored,
- not present targeted advertising to minors, nor advertising based on sensitive personal data (sexual orientation, ethnic origin, etc.),
- prioritize the handling of notices submitted by so-called trusted flaggers[5],
- establish an internal system for handling complaints and alternative dispute resolution.
Online marketplaces (online platforms that allow consumers to conclude distance contracts with traders) must additionally fulfill the following obligations:
- allow products and services to be sold only by traders who have provided the mandatory information, thus applying the "know your business customer" principle,
- design online interfaces so that traders can easily provide information identifying the products or services offered, such as a logo, symbol, or trademark identifying the trader, or information about safety labels and markings,
- inform consumers who purchased a product or service from a trader if the provider learns that the trader was offering an illegal product or service.
For very large online platforms and search engines, some of the above rules are tightened, and they have additional special rules.
What is illegal content? Illegal online content notably includes:
- copyright infringement – uploading and using copyrighted works without the author’s consent,
- illegal products including counterfeits,
- illegal hate speech, terrorist content, discriminatory content,
- sharing of child sexual abuse material, sharing private images without consent,
- cyberstalking,
- violations of consumer rights[6].
Does the DSA bring any benefits besides a range of obligations?
For service providers:
The DSA strengthens the conditions for exemption from liability for illegal content:
a. the intermediary is not liable for uploaded illegal content it is unaware of (and, once it becomes aware of such content, it must take steps to remove it),
b. the intermediary does not lose this liability exemption merely because it carries out voluntary own-initiative investigations aimed at detecting illegal content.
The introduction of uniform rules for the European market will support the development of micro and small enterprises.
For service users (including companies using online platforms):
- Easy reporting of illegal content, goods, and services.
- Being informed in case of removal of user content and the ability to appeal.
- Transparency of platforms including algorithms for content recommendation.
Implementing these new obligations into a company's internal processes involves extensive processing of personal data. It is therefore essential to remain compliant with the GDPR. We recommend reviewing the documentation governing personal data processing and making any necessary changes.
Stuchlíková & Partners assists digital service providers in ensuring that all company processes are maximally efficient and fully compliant with the new regulation.
If you are interested in more information or in receiving an offer of our legal services, do not hesitate to contact us at info@stuchlikova.com or call +420 222 767 393.
Note: The list of obligations for providers according to the DSA is not exhaustive. For the sake of clarity, we have selected the most important ones.
1. Regulation (EU) 2022/2065 of the European Parliament and of the Council of October 19, 2022, on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) ("DSA").
2. Recital (18) DSA.
3. Recital (15) DSA.
4. Recital (67) DSA.
5. A trusted flagger is a person granted that status. Under Article 22 DSA, trusted flagger status may be awarded only to a legal person that has particular expertise and competence in detecting, identifying, and reporting illegal content, is independent of any online platform provider, and submits notices diligently, accurately, and objectively.
6. Recital (12) DSA.