
Digital Services Act Explained

Last year, the European Union enacted a new set of regulations to modernise and harmonise the rules for digital services across EU Member States. These regulations, known as the Digital Services Act (DSA), aim to provide improved digital safeguards for all users and consumers of digital goods and services. But what does the DSA actually entail, how does it protect consumers' rights, and what are the implications for online platforms?


By February 2024, the DSA will apply across the entire EU, complementing national legislation in EU Member States. Each Member State will need to appoint a Digital Services Coordinator, an independent authority responsible for supervising the implementation and enforcement of the Regulation. The Coordinator will have the power to impose fines and penalties for non-compliance, ensuring accountability and transparency in online spaces.

How does the DSA protect consumers' rights?

Through its new rules, the DSA aims to better protect consumers and their fundamental rights online. This includes increased legal priority for Trusted Flaggers, safeguards for young digital users, and reporting and removal processes that address illegal content.

Trusted Flaggers

  • The DSA gives Trusted Flaggers increased priority status by creating a legal basis for Trusted Flagger recognition across the European Union.
  • The status of Trusted Flagger will be awarded by the Digital Services Coordinator.
  • Reports made by Trusted Flaggers will have to be prioritised by online service providers.
  • This creates a legal obligation for companies to act rapidly on reports received from recognised organisations like hotlines and helplines. Find out more about Trusted Flaggers.



Reporting & Removal

  • Users gain enhanced rights when reporting content, including products, on digital service platforms. Platforms will be obligated to inform users of any decision taken after a report is made, including the reasoning behind it, and must provide a mechanism to contest the decision.
  • Where enabled by national laws, Member State authorities can order any platform operating in the EU, irrespective of where it is established, to remove illegal content. This process aims to make content removal more efficient.

Minor Safety

  • All online platforms are obligated to remove child sexual abuse material (CSAM) and other non-consensual sexual content promptly. Failure to do so can lead to a fine of up to 6% of global turnover.
  • Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors.
  • Very large online platforms (VLOPs), those with over 45 million monthly active users, are no longer permitted to serve targeted advertising based on profiling of children.

Implications for Online Platforms

Rules and obligations will differ according to platform size, with the toughest rules applying to VLOPs. Because certain VLOPs wield considerable influence over public opinion, politics and discourse, these rules were introduced to mitigate systemic risks such as manipulation and disinformation through digital services. So far, 19 VLOPs have been designated and will have to comply with the strictest rules, starting four months after their designation. Some of the most significant rules include:

  • VLOPs have to regularly assess the risks their systems pose, including risks stemming from online content and products, and systemic risks to the protection of public interests, fundamental rights, public health and security.
  • Regulators must be informed of the details behind the platforms' algorithmic processes.
  • The European Centre for Algorithmic Transparency (ECAT) will provide the Commission with in-house technical and scientific expertise to ensure that the algorithmic systems used by very large online platforms and search engines comply with the risk management, mitigation and transparency requirements of the DSA.
  • Platforms have to take proactive measures against harmful online practices on their services and publish transparency reports on content moderation decisions and risk management.

To comply with the DSA, respond swiftly to emerging threats and maintain a safer online environment for their users, companies may have to expand their Trust and Safety resources. After the recent surge of Trust and Safety layoffs across the tech industry, this could drive renewed investment in the online safety sector. Investing in Trust and Safety is not only beneficial for user safety but can also promote the growth of digital platforms. Read more about these benefits here.


The DSA is a critical step taken by the EU to ensure harmonised protection of people’s fundamental rights online. It takes a clear stance on prioritising Trust and Safety on online platforms and underscores the importance of investing in Trust and Safety teams. Should your company have a Trusted Flagger program? Learn more about the ins and outs of Trusted Flagger programs.

Why are INHOPE member hotlines the ideal Trusted Flagger candidate? Click here to find out.
