
General

Ensuring a safe and responsible online environment is one of the European Union's key priorities. Regulation (EU) 2022/2065 on a single market for digital services and amending Directive 2000/31/EC (the Digital Services Act), which entered into full application on 17 February 2024, sets new standards for the liability of intermediary service providers in relation to illegal content and other societal risks. It gives online users across the EU market more control over their online experience and protects their fundamental rights, while giving society as a whole greater oversight of platforms and reducing systemic risks.

The Digital Services Act, commonly referred to by its English acronym DSA, applies to a large group of online service providers. These include internet access providers, domain registries and registrars, hosting providers (including cloud services), online search engines, and online platforms such as social networks, video-sharing portals, online forums and chat rooms, and online marketplaces. The DSA imposes due diligence obligations on all of them to ensure a transparent and secure online environment, with the scope of the obligations depending on the type, size and nature of the service itself.

At national level, the implementation of the Digital Services Act is regulated by the Act on the Implementation of the (EU) Regulation on the Single Market for Digital Services (ZIUETDS; Official Gazette of the Republic of Slovenia, No 30/24), which assigned the tasks of the Digital Services Coordinator to the Agency for Communication Networks and Services of the Republic of Slovenia. The Agency is thus responsible for the implementation of the DSA and for the supervision of intermediary service providers established in Slovenia. An exception applies to Article 26(1) and (3) and the whole of Article 28 of the DSA, supervision of which is carried out by the Information Commissioner.

Supervision of very large online platforms (such as Facebook, Instagram, X, TikTok, Amazon, AliExpress, Temu, the App Store or Google Play) and very large online search engines (such as Google or Bing) is carried out by the European Commission itself. A list of these providers can be found here.


What obligations does the DSA impose?

The DSA's central guiding principle is that what is illegal offline must also be illegal online. Whether the issue is dangerous or illegal products, scams and fraud, the protection of minors, the misuse of personal data or online violence, the aim is to give users more choice, better information and an easier way to report this type of content. More detailed information on the types of illegal content can be found here.

Obligations imposed by the DSA on service providers include, but are not limited to:

  • Designating a contact point that allows direct communication by electronic means, both for recipients of the service and for the authorities of EU Member States,
  • publishing clear terms of service that set out any content moderation rules, and enforcing them consistently.

Additional obligations imposed on online platform providers include, inter alia:

  • Allowing users to report illegal content and to complain about content moderation decisions,
  • giving priority to reports received from trusted flaggers, and participating in dispute resolution procedures before out-of-court dispute settlement bodies,
  • a ban on profiling-based advertising that targets minors or relies on sensitive personal data,
  • publication of the number of monthly users and more detailed transparency reports (a minimal counting sketch follows this list).
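
For context, the DSA requires providers of online platforms to publish the average number of monthly active recipients of their service in the EU at least once every six months, averaged over the past six months. Purely as a minimal illustrative sketch, assuming a hypothetical activity log of (recipient, date) pairs and simple per-calendar-month de-duplication (the regulation does not prescribe a calculation method, and every name below is invented), the figure could be derived roughly like this:

    from collections import defaultdict
    from datetime import date

    def average_monthly_active_recipients(events, months):
        """Average the number of unique active recipients per calendar month.

        `events` is a hypothetical activity log of (recipient_id, activity_date)
        pairs; `months` lists the (year, month) pairs of the reporting window,
        e.g. the past six months.
        """
        active = defaultdict(set)
        for recipient_id, day in events:
            # Count each recipient at most once per calendar month.
            active[(day.year, day.month)].add(recipient_id)
        counts = [len(active.get(m, set())) for m in months]
        return sum(counts) / len(counts)

    # Three recipients are active in January; two of them return in February,
    # so the average over the two months is (3 + 2) / 2 = 2.5.
    events = [
        ("alice", date(2024, 1, 3)), ("alice", date(2024, 1, 20)),
        ("bob", date(2024, 1, 9)), ("carol", date(2024, 1, 30)),
        ("alice", date(2024, 2, 2)), ("bob", date(2024, 2, 14)),
    ]
    print(average_monthly_active_recipients(events, [(2024, 1), (2024, 2)]))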

Very large online platforms and very large online search engines, i.e. those reaching more than 45 million average monthly users in the EU (roughly 10% of the EU population), are subject to the strictest rules. Their obligations include:

  • Taking measures to manage systemic risks arising from their services,
  • allowing users to choose a recommender system option that is not based on profiling (see the sketch after this list),
  • publishing repositories of all advertisements shown on their interfaces, including information on the advertisers,
  • allowing access to data for vetted researchers and the general public.
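
The regulation mandates that such a non-profiling option be offered; it does not prescribe how a recommender system implements it. Purely as an illustrative sketch, in which the reverse-chronological fallback and all data structures are assumptions rather than anything required by the DSA, the user's choice might be wired up roughly like this:

    from datetime import datetime

    def rank_feed(items, interests, use_profiling):
        """Order feed items for one user.

        `items` are hypothetical posts, each a dict with "published_at"
        (datetime) and "topics" (list of topic strings); `interests` maps
        topic -> affinity weight built from the user's profile.
        """
        if not use_profiling:
            # Non-profiling option: plain reverse-chronological ordering,
            # identical for every user who opts out.
            return sorted(items, key=lambda i: i["published_at"], reverse=True)
        # Profiled option: score each item by overlap with the user's interests.
        return sorted(items,
                      key=lambda i: sum(interests.get(t, 0.0) for t in i["topics"]),
                      reverse=True)

    posts = [
        {"published_at": datetime(2024, 5, 2), "topics": ["sports"]},
        {"published_at": datetime(2024, 5, 1), "topics": ["politics", "eu"]},
    ]
    # With profiling off, the newer sports post comes first for everyone.
    print(rank_feed(posts, {"politics": 1.0}, use_profiling=False))
    # With profiling on, the politics post is ranked first for this user.
    print(rank_feed(posts, {"politics": 1.0}, use_profiling=True))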

These providers are also required to carry out annual risk assessments considering how their services could pose risks to, for example, fundamental rights, civic discourse or public health, and consequently to take reasonable and effective measures to mitigate the systemic risks identified.

More detailed requirements for intermediary service providers can be found here.
