White & Case LLP | Rory Hishon | Jenna Rennie | Joseph Carroll | Ben Harris

European Union

Legal framework

Legal regime

Does your jurisdiction have a legal regime governing or addressing online safety? If so, how does it operate?

The European Union (the EU) has enacted a comprehensive legal regime for online safety in the form of Regulation (EU) 2022/2065, the Digital Services Act (DSA), which became fully applicable on 17 February 2024 (with certain obligations applying from 2023 for designated providers).

The DSA is directly effective across all EU member states and establishes a harmonised set of rules for providers of online intermediary services, including mere conduit services, caching services, hosting services, online platforms, very large online platforms (VLOPs) and very large online search engines (VLOSEs). It aims to create a ‘safe, predictable and trusted online environment’ by imposing a range of obligations on providers, depending on the nature and number of users of their services in the EU. For VLOPs and VLOSEs, these obligations include assessing and mitigating risks arising from illegal content and other systemic risks to users, including risks to children and to users’ fundamental rights.

Oversight and enforcement of the DSA are carried out by national digital services coordinators (DSCs) appointed by each member state, while the European Commission (the Commission) acts as the primary regulator for VLOPs and VLOSEs. The DSA is supplemented by Commission guidelines, delegated acts, and codes of conduct. The regime is designed to be risk-based and proportionate, scaling obligations according to the size, reach and risk profile of each service. Providers of certain services are required to publish annual transparency reports and undergo independent compliance audits.

The DSA is not the only source of EU law that may impact the online safety landscape. Other legal instruments contribute to a safer online environment by targeting particular risks and strengthening protective measures across digital services. For example, the General Data Protection Regulation (the GDPR) is primarily designed to safeguard personal data and privacy, and the Terrorist Content Online Regulation (the TCO) deals with the dissemination of terrorist content online. However, given the number and variety of such laws, and the fact that they narrowly relate to particular issues or aspects of online safety, we have focused below on the DSA, as the EU’s principal legislation designed to address online safety more broadly.

Online harms covered

Which online harms are covered under the relevant legislation and how are these harms defined?

The DSA addresses a range of online harms, with particular focus on illegal content and ‘systemic risks’ that may arise from the use of online platforms. Illegal content is defined as any information that is unlawful under EU or member states’ national law (such as terrorist content, child sexual abuse material, hate speech and intellectual property infringement). Systemic risks include the dissemination of illegal content but extend more broadly to include risks to fundamental rights, risks to civic discourse, electoral processes and public security, and risks to public health, minors and physical or mental wellbeing. In addition, there is an express provision addressing protection of minors, and the Commission has recently published guidelines giving examples of potential harms to minors, such as content promoting self-harm, suicide, eating disorders or extreme violence.

However, the DSA does not provide an exhaustive list of illegal content or systemic risks. VLOPs and VLOSEs must conduct risk assessments (at least once a year) to identify for themselves any systemic risks stemming from the use, design or functioning of their service and its related systems.

Online services covered

Which online services are covered under the law and how are these services defined?

The scope of the DSA is intentionally broad, covering a wide array of online services that are accessible to users in the EU. These include intermediary services (ie, mere conduit, caching and hosting services), and online platforms that store and disseminate information to the public at the request of users (eg, social media networks, online marketplaces and app stores). Online platforms and search engines that have more than 45 million monthly active users in the EU may be designated by the Commission as VLOPs or VLOSEs, in which case, they are subject to additional obligations under the DSA.

Importantly, the DSA applies to any provider whose services are offered to EU users, regardless of where the provider itself is established. Providers established outside the EU must appoint a legal representative within the EU to engage with the Commission and DSCs.

Territorial scope

What is the territorial scope of the relevant law?

The DSA has broad territorial reach. It applies to any provider offering online intermediary services to users in the EU, regardless of the provider’s location or place of establishment, and therefore has extraterritorial effect for non-EU providers.

Codes of practice

Are there any codes of practice or other non-binding guidelines or recommendations relating to online safety in your jurisdiction?

The DSA is supplemented by a range of non-binding codes of conduct and Commission guidelines. These typically provide detailed recommendations on best practices for DSA obligations, and play a significant role in shaping compliance practices and regulatory expectations. For example, in recently published guidelines concerning obligations related to minors’ safety, the Commission stated that it would use the guidelines to ‘impose a limit on the exercise of its discretion whenever applying’ the relevant provisions of the DSA.

While adherence to DSA codes of conduct is voluntary, compliance can serve as a mitigating factor in enforcement proceedings and is considered evidence of good-faith efforts to meet DSA obligations. The codes and guidelines may be updated to reflect emerging risks and technological developments, ensuring that the regulatory framework remains responsive and effective.

Harmful versus illegal content

How does the law in your jurisdiction distinguish between harmful and illegal content?

The DSA draws a clear distinction between illegal content (meaning any content that is prohibited under EU or national member state law) and other content that service providers are expected to address. As an example, article 16 requires providers to establish a ‘notice and action’ mechanism for users to report content that they consider to be illegal. This requirement only applies to illegal content – it does not extend to other types of harmful but legal content (eg, content that is offensive or that violates the service’s terms and conditions).

However, while the DSA does not define ‘harmful’ content per se, it includes a number of requirements that are relevant to providers’ handling of such content. For example, article 28 of the DSA requires providers to implement ‘appropriate and proportionate’ measures to ensure a high level of safety, security and privacy for children, which would likely include measures to protect children from harmful content. Additionally, providers of VLOPs and VLOSEs must assess and mitigate systemic risks – such as threats to electoral processes or harm to vulnerable users – which will include risks that arise from legal but ‘harmful’ content.

Extremist and terrorism-related content

How does your jurisdiction regulate the dissemination of extremist and terrorism-related content online?

The DSA’s duties relating to illegal content – including the notice and action mechanisms – apply to any terrorism-related content that amounts to an offence under EU or national member state law.

The DSA also requires VLOPs and VLOSEs to assess and mitigate systemic risks associated with the dissemination of terrorist content. This includes evaluating how their services might be used to spread extremist material and implementing measures to prevent such risks.

Additional requirements are imposed for certain providers under terrorism-specific legislation, particularly Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online (the TCO). The TCO defines terrorist content as material that incites, solicits or contributes to terrorist offences. The duties imposed by the TCO include requirements to take proactive measures to prevent terrorist content appearing on a service, and to remove such content within one hour of receiving a removal order from a competent authority.

Disinformation versus misinformation

How, if at all, does the law in your jurisdiction distinguish between misinformation and disinformation online? Does it include malinformation?

The DSA recognises the challenges posed by both disinformation and misinformation online and requires VLOPs and VLOSEs to assess and address these issues as part of their systemic risk assessment and mitigation obligations. While not expressly defined under the DSA, ‘disinformation’ is understood as false or misleading content that is disseminated with the intent to deceive or cause harm, while ‘misinformation’ refers to false or misleading content shared without malicious intent.

The Commission has issued guidelines for VLOPs and VLOSEs on the mitigation of systemic risks for electoral processes under article 35 of the DSA, which emphasise tackling disinformation and misinformation during electoral processes. The guidelines recommend that platforms implement robust mitigation measures, such as providing users with access to official electoral information, collaborating with independent fact-checkers, and clearly labelling or demoting content identified as false or misleading. They also encourage platforms to support media literacy initiatives to help users recognise and resist disinformation, and to adapt recommender systems to reduce the amplification of deceptive content. These measures are to be balanced with the protection of fundamental rights.

In addition, the Code of Conduct on Disinformation (COCD) sets out voluntary commitments for service providers to tackle the spread of false and misleading information online. For example, signatories commit to preventing the misuse of advertising systems to spread disinformation, strengthening efforts around media literacy and empowering users with tools to help them assess the provenance of digital content. Although adherence to the COCD remains voluntary, platforms that sign up are expected to implement its measures in good faith and report regularly on their progress. The DSA encourages participation in such codes as part of a broader strategy to foster accountability and transparency in the digital ecosystem.

This article first appeared on Lexology.