One year of the DSA
How does the Digital Services Act affect our internet?

The Digital Services Act (DSA) has been in force in the EU for a year now. The DSA comes up in many current debates on internet safety, for example around disinformation campaigns or the worrying influence of individuals such as Elon Musk or Mark Zuckerberg on large social media platforms. What exactly does the DSA do? What dangers does the law protect us from? And what are the allegations of censorship all about?

The Digital Services Act (DSA for short) is an EU regulation that has been legally binding for companies operating on the internet since February 17, 2024. It does not matter whether an online provider is based in the European Union (EU) or not: as soon as it offers its service in the EU, it must comply with the requirements of the DSA. With the DSA, the European Union wants to ensure that internet platforms respect the rights of all people, that citizens can trust the digital services they use, and that everyone can navigate the internet safely, regardless of which services they use. Media coverage of the DSA usually focuses on social media platforms, but the regulation also applies to large online marketplaces and search engines.

In Germany, the DSA has been implemented in national law through the Digitale-Dienste-Gesetz (Digital Services Act). The coordinating authority is the Federal Network Agency (Bundesnetzagentur). As the independent Digital Services Coordinator, it is responsible for overseeing compliance with the regulations. Citizens can also lodge a complaint with it if they believe they have identified a breach of the DSA.

How does the DSA ensure that everyone is protected?

Under the DSA, very large online platforms in particular are required to regularly demonstrate that they are taking appropriate precautions against illegal or harmful content on their services. To this end, they must, for example, submit risk assessment reports showing which risks exist and how the platform is responding to them. Various bodies, such as the Federal Network Agency in Germany and the European Commission, check whether the services meet their obligations appropriately.

The German translation of the DSA comprises 102 pages, which alone shows how comprehensive the regulation is. Here we give three examples of how the DSA affects our use of the internet. Beyond these examples, the DSA covers many other topics, including data protection, algorithmic recommendation systems and manipulative design elements (so-called dark patterns).

Hate speech
Dealing with hate speech is not regulated in the DSA itself, but in the "Code of Conduct on Countering Illegal Hate Speech Online+". This code of conduct was integrated into the DSA framework in January 2025. The code was first developed in 2016 and has been signed by many major online platforms such as Facebook, TikTok and YouTube. It stipulates, for example, that reports of illegal hate speech must be assessed within 24 hours and the content removed where necessary. The platforms must also work together with suitable organizations, for example from civil society, to develop effective approaches against illegal hate speech.

Disinformation
For disinformation, too, the DSA refers to a code that has existed for some time. The "Code of Practice on Disinformation" has been in place since 2018 and was revised and strengthened in 2022. Since February 2025, it has been officially integrated into the DSA framework as a code of conduct.
The code stipulates that online platforms must take appropriate measures against the spread of disinformation. They must also regularly report on the risks on their platforms and how they intend to minimize them.

Measures to protect children and young people
An important element of the DSA is the protection of children and young people on the internet. According to the DSA, all online platforms used by children must take suitable and proportionate measures to protect them. One example is providing accounts for minors that offer the highest level of privacy and security by default. Platforms are also not allowed to use children's personal data to show them advertising. You can find more information on this topic in our article "What does the new Digital Services Act mean for children?".

Further information on the DSA and how it guarantees the protection of children and young people can be found in the European Union's publication "What is the Digital Services Act (DSA)?".

Does the DSA really restrict freedom of expression?

Recently, there have been repeated claims that the DSA serves as a basis for censorship of free expression in Europe. The claim that censorship prevails on the internet in Europe gained prominence through a video message from Mark Zuckerberg, whose company operates the platforms Facebook, Instagram, Threads and WhatsApp. The US Vice President also made critical comments at the Munich Security Conference in February 2025, suggesting that the EU was threatening to silence its own population whenever unwelcome content was posted on social media.

It is true that the DSA specifies which content must be removed from platforms. However, this only concerns illegal content such as hate speech, illegal disinformation and depictions of child abuse. Nowhere does the DSA say that legal content such as opinion pieces should be deleted. That illegal content such as insults, calls for violence and incitement to hatred is not covered by freedom of expression, and that there is therefore no right to disseminate it unhindered, has long been established legal practice in Germany and the EU; it is not a new requirement introduced by the DSA.

An important point that is rarely mentioned in the debate: the DSA also protects users from the arbitrary deletion of their content and accounts. It obliges platforms to state transparently why content has been removed. Even if content is only restricted (e.g. through reduced reach or exclusion from monetization), the platform must justify its decision to the affected users. Platforms must also inform users clearly and transparently about the options they have to contest a decision, for example through an internal complaint-handling procedure, out-of-court dispute resolution or legal remedies. So if a mistake really has been made in deleting content, the DSA guarantees that EU citizens can defend themselves against it. Important: the review and deletion of content is always carried out by the platforms themselves. Neither the European Commission nor the Federal Network Agency has the authority to delete content on the internet.