What are digital services?
Digital services cover a broad range of online services, from simple websites to internet infrastructure services and online platforms. The rules of the Digital Services Act apply mainly to online intermediaries and platforms, such as online marketplaces, social networks, content-sharing platforms, app stores and online travel and accommodation platforms.
What is the Digital Services Act?
The Digital Services Act is a single set of rules that apply across the EU with the aim of creating a safer digital space where the fundamental rights of all users of digital services are protected.
What is a Digital Services Coordinator?
The Digital Services Coordinator is the body responsible for applying and enforcing the Digital Services Act in each Member State. In Slovenia, the law designates the Agency to carry out the tasks of the Digital Services Coordinator.
What are the tasks of the Digital Services Coordinator?
Together with the European Commission, the Digital Services Coordinators monitor the implementation of the Digital Services Act. They oversee the platforms established in their country and can carry out inspections and impose fines for infringements. The Coordinator, which in Slovenia is the Agency, also grants trusted flagger status to organisations or institutions that apply for it and that identify and report illegal content. The Coordinator also certifies out-of-court dispute resolution bodies that decide disputes between users and online service providers. The Coordinator forwards complaints from Slovenian users against providers from any EU country, or providers operating in the EU, to the other competent coordinators.
What is the relationship between national authorities and the European Commission in enforcing the DSA?
The Digital Services Coordinators in each Member State are responsible for monitoring and enforcing the DSA at national level. The European Commission plays a central coordinating role, especially in cases involving very large online platforms.
What is illegal content?
The Digital Services Act provides a pan-European framework for detecting, reporting and removing illegal content, although it does not define what is illegal online per se. What constitutes illegal content is defined in other laws at EU level or at national level in individual sectoral laws. For example, terrorist content, child sexual abuse videos or illegal hate speech are defined at EU level. Where content is illegal only in a particular Member State, it should as a general rule only be taken down in the territory where it is illegal.
What does "algorithmic transparency" mean?
Platforms must disclose how their algorithms work, in particular their recommender systems and advertising systems. They must also allow users to adjust the parameters of those systems to their own preferences, in order to ensure greater transparency and better protection of their rights.
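Purely as an illustration, the sketch below shows what user-adjustable recommender settings of this kind might look like in practice. The DSA does not prescribe any interface or data format, and every name used here is hypothetical.

```typescript
// Illustrative sketch only: the DSA does not prescribe any API or data
// format, and every name below is hypothetical.

// Settings a platform might expose so users can adjust how content is
// ranked for them, including an option not based on profiling.
interface RecommenderSettings {
  rankingBasis: "personalised" | "chronological" | "popularity"; // main ordering signal
  usesProfiling: boolean;   // whether profiling of personal data feeds the ranking
  mainParameters: string[]; // plain-language list of the most significant ranking criteria
}

// Example: a user opting for a feed that is not based on profiling.
const nonProfilingChoice: RecommenderSettings = {
  rankingBasis: "chronological",
  usesProfiling: false,
  mainParameters: ["recency", "accounts the user follows"],
};

console.log(nonProfilingChoice);
```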
How does the Digital Services Act regulate content moderation on online platforms?
Online platforms must have clear terms and conditions that define the rules for moderating content. When removing or restricting access to content, platforms must inform users of the reasons for this action and give them the opportunity to lodge a complaint or use mediation. They are also obliged to be transparent about their use of automated moderation tools.
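As an illustration only, the sketch below shows the kind of information such an explanation (a "statement of reasons") would typically carry when a platform removes or restricts content. The field names are hypothetical; the DSA does not mandate any particular data format.

```typescript
// Illustrative sketch only; field names are hypothetical and the DSA
// does not mandate any particular data format.

// Information a platform would typically give a user when explaining
// why their content was removed or restricted.
interface StatementOfReasons {
  decision: "removal" | "visibility_restriction" | "account_suspension";
  ground: string;                 // the legal provision or term of service relied on
  factsAndCircumstances: string;  // what the content was and why it breached the rule
  detectedAutomatically: boolean; // whether automated tools flagged the content
  decidedAutomatically: boolean;  // whether the decision itself was automated
  redressOptions: string[];       // e.g. internal complaint, out-of-court dispute resolution, court
}

// Example record accompanying a removal decision.
const example: StatementOfReasons = {
  decision: "removal",
  ground: "Platform terms of service, rules on hate speech",
  factsAndCircumstances: "Post contained illegal hate speech as defined by national law.",
  detectedAutomatically: true,
  decidedAutomatically: false,
  redressOptions: ["internal complaint system", "out-of-court dispute resolution", "court"],
};

console.log(example);
```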
What do service providers in Slovenia need to ensure?
All providers must provide a single point of contact - an easily accessible contact point where users can electronically report content they consider illegal. Platforms must process these reports. If a platform removes a user's content, restricts its visibility or restricts payments, it must explain why it has done so and give the user the opportunity to complain or seek mediation. Platform providers must also ensure that advertisements are clearly labelled and that the advertiser is clearly identifiable.
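Purely as an illustration, the sketch below shows the kind of information an electronic report of suspected illegal content would normally contain. The DSA requires that such reports can be submitted electronically but does not mandate a specific format, and all names used here are hypothetical.

```typescript
// Illustrative sketch only; the DSA does not mandate a specific report
// format, and every field name here is hypothetical.

// Information a user report of suspected illegal content would normally carry.
interface IllegalContentNotice {
  contentUrl: string;            // exact electronic location of the content
  explanation: string;           // why the user considers the content illegal
  reporterName?: string;         // contact details of the person reporting (where required)
  reporterEmail?: string;
  submittedInGoodFaith: boolean; // confirmation that the report is made in good faith
}

// Example report submitted through a platform's contact point.
const notice: IllegalContentNotice = {
  contentUrl: "https://example.com/post/12345",
  explanation: "The post offers counterfeit goods for sale.",
  reporterEmail: "user@example.com",
  submittedInGoodFaith: true,
};

console.log(notice);
```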
What additional obligations do very large online platforms (VLOPs) have?
Very large online platforms (VLOPs) with more than 45 million monthly active users in the EU have the following additional obligations: conducting regular assessments of systemic risks (e.g. misinformation, illegal content), putting in place risk mitigation measures, undergoing independent external audits, providing access to data for vetted researchers, and meeting additional transparency obligations regarding advertising and the functioning of their algorithms.
Who resolves disputes between users and platform providers?
Users of online platforms have the possibility to use out-of-court dispute resolution to resolve disputes relating to decisions taken by the online platform provider, including those that could not be satisfactorily resolved within the provider's internal complaints handling system. They may do so with certified bodies that are independent and have the necessary resources and expertise to carry out their activities fairly, quickly and cost-effectively. These bodies are certified by the Agency in its role as Digital Services Coordinator, which also issues them an out-of-court dispute resolution certificate and publishes the list of certified bodies on its website.
When can you contact the Agency as Digital Services Coordinator, and when can you contact platforms directly, even very large ones based outside Slovenia?
Users are expected to contact the Agency when they consider that a platform is not complying with its obligations under the Regulation, for example if it does not provide a prominent, easily accessible means of reporting illegal content electronically, or if it fails to respond to users and explain why it has interfered with their content on the platform. This means that, as in other countries, users must first contact the platforms. Users should also contact the platforms directly to appeal against their decisions.
What to do if a user sees content that they think is illegal?
If users believe that information constituting illegal content has been published on an online platform, they should report it to the competent authorities or the police, or file a report directly with the provider. They can also report the content to a trusted flagger. The Agency will publish on its website the list of organisations it grants trusted flagger status.
So what can residents report to the Agency and how?
As in other countries, users will first be able to contact the platforms. Anyone who believes that a provider has breached any of the obligations for digital service providers can lodge a complaint with the Agency. The Agency will investigate the complaint, deal with it appropriately and, if necessary, forward it to the competent authorities in Slovenia or to the competent digital services coordinator in another EU Member State.
You can send your complaint by e-mail to info.box@akos-rs.si or by post to the Communications Networks and Services Agency of the Republic of Slovenia, Stegne 7, 1000 Ljubljana.
You can also make a complaint orally on the record at the Agency, during office hours, i.e. from Monday to Friday from 9 to 11 a.m., and on Wednesdays from 1 to 2 p.m.
Who are trusted flaggers?
Trusted flaggers are legal entities, governed by private or public law, that have specific expertise and competence in detecting, identifying and reporting illegal content and are independent of online platforms. Online platforms must ensure that reports submitted by trusted flaggers are prioritised and processed swiftly.
How can organisations apply for trusted flagger status?
According to the Digital Services Act, trusted flagger status is granted by the Digital Services Coordinator of the Member State in which the applicant is established. To be granted the status, the applicant must:
- have specific expertise and competence for the purpose of detecting, identifying and reporting illegal content;
- be independent from any online platform provider;
- carry out its activities for the purpose of submitting reports diligently, accurately and objectively.
Who are vetted researchers and what is their role?
Vetted researchers are experts or institutions that have a mandate to access data from very large online platforms. Their task is to study systemic risks, such as the impact of platforms on democratic processes, public health or disinformation. These researchers help to improve the transparency and accountability of platforms.
How does the Agency ensure that users are informed of their rights?
The Agency for Communication Networks and Services regularly publishes on its website guidelines, the list of certified out-of-court dispute resolution bodies, the list of trusted flaggers and other relevant information.
For further questions, please contact: info.box@akos-rs.si