
Navigating the 2024 Online Safety Landscape: A Comparative Analysis of the EU’s Digital Services Act and UK’s Online Safety Act

Authored by
Siddhant Chatterjee
Public Policy Strategist at Holistic AI
Published on
Feb 15, 2024

2024 is set to play a crucial role in shaping the landscape of online safety, propelled by a significant surge in risks and hazards infiltrating social media platforms. Amid a perceptible rise in live broadcasts of shootings, the online radicalization of individuals, mass AI-generated content, the widespread dissemination of misinformation and disinformation during elections and crises, and the exacerbation of digital well-being issues by opaque, black-box algorithmic patterns, governments across the globe have escalated their efforts, enacting a series of online safety laws aimed at mitigating these harms.

Notably, in the European context, two significant pieces of legislation have taken centre stage: the European Union's Digital Services Act (DSA) and the United Kingdom's Online Safety Act (OSA). With obligations under the DSA becoming applicable from 17 February 2024, this blog delves into both pieces of legislation, offering an in-depth exploration and a comparative analysis of some of their key provisions.

We’ll tackle who these laws apply to and what you can do to begin preparing for these new regimes, which carry large fines for non-compliance.

Key Takeaways:

  1. The EU Digital Services Act (DSA) seeks to establish comprehensive accountability and transparency, harmonize national rules, and impose special obligations, including audits, on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to ensure safe online environments, with enforcement by the European Commission.
  2. The UK's Online Safety Act (OSA) covers user-to-user services, search engines, and services publishing explicit content. The legislation emphasizes a "duty of care", categorizes entities by reach and risk, and grants exclusive regulatory authority to Ofcom, which is empowered to issue codes of conduct, conduct investigations, and impose fines for non-compliance.
  3. Enforcement of the DSA involves Digital Services Coordinators, with European Commission oversight of VLOP/VLOSE compliance, while the OSA is enforced by Ofcom, which categorizes entities by reach and risk rather than size.
  4. The DSA has broad coverage of digital services and illegal content, whereas the OSA has a narrower focus, categorizing illegal content into general and priority offenses to address specific online harms.
  5. The DSA permits fines of up to 6% of global annual turnover, while the OSA grants Ofcom the authority to impose fines of up to 10% of global annual turnover, with potential criminal sanctions for non-compliance.

What is the EU Digital Services Act (DSA)?

Seeking to ensure safe online environments for users, the Digital Services Act (DSA) outlines a comprehensive accountability and transparency regime for a variety of digital services and platforms operating in the European Union (EU). Encompassing a spectrum of digital platforms – from online marketplaces and Internet Service Providers (ISPs) to social media platforms and search engines – the DSA harmonizes the diverging national rules of European Member States that had emerged under the E-Commerce Directive of 2000. This legislation, coupled with the Digital Markets Act and the upcoming EU AI Act, seeks to usher in a new regulatory paradigm for the effective governance of digital technologies in the European Single Market.

Requirements on digital services presented by the DSA

Taking a staggered approach towards mitigating systemic risks arising from the use of digital platforms, the DSA imposes cumulative obligations (Articles 11 through 48, housed in Chapter III) on intermediary services that fall under the definitions of hosting services, online platforms, Very Large Online Platforms (VLOPs), and Very Large Online Search Engines (VLOSEs). The provisions apply differently depending on the nature of the intermediary service, with VLOPs and VLOSEs – platforms and search engines with more than 45 million monthly active users in the EU – subject to the most stringent requirements. The following diagram illustrates the many obligations for covered entities under the DSA.

[Diagram: DSA obligations – a staggered approach]

In addition to complying with the general due diligence requirements applicable to all intermediary services, the 22 designated VLOPs and VLOSEs – including the likes of YouTube, Google Search, Instagram, and Amazon, as well as newer designees like PornHub – are subject to a set of special obligations outlined in Articles 34 to 48 of the DSA. These came into effect in late August 2023, in line with the European Commission’s mandate for them to apply six months from the date of VLOP/VLOSE designation.

To fulfil these obligations, the DSA introduces a set of novel tools – ranging from comprehensive risk assessments and the ability for users to opt out of personalized recommendations, to mandatory algorithmic explainability and transparency, and even providing vetted researchers access to platform data via APIs. Most notable among these requirements, however, is the provision subjecting platforms to independent audits of their safety and integrity efforts under Article 37.

Who will enforce the DSA?

The enforcement of the DSA will primarily be carried out by competent authorities at the national level and their corresponding Digital Services Coordinators. However, when it comes to overseeing VLOPs and VLOSEs, the Commission will have exclusive authority to enforce obligations specific to these entities. The Commission will collaborate with Digital Services Coordinators to oversee how these platforms address systemic risks that may arise on their platforms.

Online Safety Act

Having received Royal Assent on October 26, 2023, the Online Safety Act (OSA) is the United Kingdom’s primary legislation on online safety, covering a broad range of risks presented by online platforms. Similar to the DSA in its regulatory scope and aims, the OSA explicitly covers three categories of services:

  • Internet services that allow users to encounter, upload, and generate content
  • Providers of search engines
  • Services that publish or display pornographic content

An important aspect of the OSA regime is the “duty of care”, which mandates covered entities to take requisite measures to protect users from harmful content. This aspect essentially moulds the OSA to focus on the underlying trust and safety processes and programs an entity deploys, as opposed to a sole focus on moderating and removing violative content.

Requirements on digital services presented by the OSA

Under the OSA, covered entities will be subject to obligations determined by the extent of their reach and the risks they might present. These translate into three broad regulatory categories:

  1. Category 1 services: the highest-reach user-to-user services with the highest-risk functionalities, subject to transparency requirements, a duty to assess risks to adults from legal but harmful content, requirements relating to fraudulent advertising, and a variety of other duties;
  2. Category 2a services: the highest-reach search services, subject to transparency and fraudulent advertising requirements; and
  3. Category 2b services: other services with potentially risky functionalities or other risk factors, subject to transparency requirements but no other additional duties.

As the primary legislation, the OSA sets the groundwork for the development of targeted secondary legislation aimed at addressing specific issues, such as child safety, violent extremism, and the dissemination of harmful or illegal content.

The legislation grants exclusive regulatory authority to the Office of Communications (Ofcom), which will fulfil this role by issuing dedicated codes of conduct pertaining to various areas of harm in line with primary and secondary legislation that covered entities must adhere to. Additionally, Ofcom will be empowered to carry out investigations and information requests to better understand how entities are addressing these risks, routinely engage with services to supervise their trust and safety efforts, and finally, levy fines and penalties in instances of non-compliance.


Comparing the DSA and OSA: Key Differences and Similarities

Scope and Purpose

Digital Services Act (DSA):
  • Broad coverage: a wide variety of digital services, covering ISPs, online marketplaces, social media platforms, search services and hosting services, with a focus on mitigating illegal content, intellectual property infringement, dark algorithmic patterns, and illegal goods, among others.
  • Extraterritorial application.

Online Safety Act (OSA):
  • Narrow coverage: covers certain user-to-user services, such as social media and video-sharing platforms, and search engines, with a focus on granularly tackling specific online harms like terrorism and extremism, and child safety, among others.
  • Extraterritorial application.

Scope and Articulation of Illegal Content

Digital Services Act (DSA):
  • Provides a flexible definition of illegal content, including actions currently illegal under Member State laws, EU treaties, and EU-wide legislation.
  • Applies equally to all types of illegal content.

Online Safety Act (OSA):
  • Categorizes illegal content into general offenses and "priority offenses" (covering child exploitation, terrorism, etc.) with detailed specifications.
  • Takes a more tailored approach to different types of illegal content.

Enforcement

Digital Services Act (DSA): Enforced by the various Digital Services Coordinators and national competent authorities; VLOP and VLOSE compliance falls under the European Commission’s purview.

Online Safety Act (OSA): Enforced by a dedicated sectoral regulator, Ofcom.

Regulatory Categorisation

Digital Services Act (DSA):
  • Providers of intermediary services
  • Providers of hosting services
  • Online platforms
  • Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs)
  • Obligations on platforms are proportional to the size of their operation.

Online Safety Act (OSA):
  • Category 1 services (highest-reach user-to-user services with risky functionalities)
  • Category 2a services (highest-reach search services)
  • Category 2b services (other services with potentially risky functionalities)
  • Obligations on platforms are proportional to their reach and risk profile, not the size of their operation.

Risk Assessment Obligations

Digital Services Act (DSA): Risk assessments deal with mitigating systemic risks arising from the design, functioning and use of platforms.

Online Safety Act (OSA): Risk assessments are concerned with identifying, mitigating and managing illegal content.

Independent Audit Provisions

Digital Services Act (DSA): Mandates annual, independent and cooperative audits for VLOPs and VLOSEs under Article 37, with a delegated regulation providing implementational details.

Online Safety Act (OSA): Mandates Ofcom to carry out algorithm assessments.

Fines and Penalties

Digital Services Act (DSA): Empowers the European Commission to levy fines of up to 6% of worldwide annual turnover in case of:
  • Breach of DSA obligations
  • Failure to comply with interim measures
  • Breach of commitments

Online Safety Act (OSA): Empowers Ofcom to:
  • Levy fines of up to £18 million or 10% of global annual turnover (whichever is higher), or apply to court for business disruption measures (including blocking) against non-compliant services
  • Bring criminal sanctions against executives in instances of non-compliance

Manage Risks, Get Compliant

The global momentum behind online safety regulation is gaining speed. Regardless of their size, companies need to implement a strong combination of proactive and reactive strategies. This is essential not only for compliance with regulations but also to reduce instances of harm on their platforms. One effective approach is to subject a company's trust and safety tools and processes to independent audits – a measure increasingly recommended by regulations like the DSA and OSA.

As leaders in the emerging fields of AI Assurance and Algorithm Auditing, Holistic AI provides comprehensive and tailored solutions to support affected entities with such compliance obligations. We do so by considering several factors, such as the complexity and novelty of conducting such audits, the need to deploy socio-technical methods to audit certain provisions, and the timeline for the application of obligations, among others.

Particularly for Independent Audits for VLOPs and VLOSEs under the DSA, Holistic AI provides the following services:

  • An independent annual audit of due diligence obligations set out in Chapter III (Articles 11 to 48) of the Digital Services Act (DSA)
  • An assessment of compliance with any commitments undertaken pursuant to codes of conduct or crisis protocols, where applicable, and
  • A Final Audit Report, in line with the guidance and template provided by the Draft Delegated Regulation at the end of the audit period, including:
    • Audit conclusions and operational recommendations on measures (with timeframes to achieve compliance) for the corresponding audited obligations,
    • A cumulative audit opinion for the report, assessing the entity’s compliance with all audited obligations, as mentioned in Article 37(1)(a) of the DSA
    • An explanation of the circumstances and reasons why certain elements could not be audited, if applicable
    • Audits Risks Analysis on inherent risks, control risks and detection risks to ensure compliance with the regulation
    • Methodologies, criteria and other technical and operational details of implementing these audits
    • Any other information, as required by the Draft Delegated Regulation on Independent Audits
    • A full version (for EU authorities and internal use) and a redacted/shortened version (for publication) of the Final Report, as required under Articles 37 and 42(5) of the DSA

To begin exploring the intersection of our trustworthy AI platform and your initiatives, schedule a demo today.

DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
