2024 is set to play a crucial role in shaping the landscape of online safety, propelled by a significant surge in risks and harms across social media platforms. Prompted by a marked rise in live-streamed shootings, online radicalization, mass AI-generated content, the widespread dissemination of misinformation and disinformation during elections and crises, and growing digital well-being concerns driven by opaque, black-box algorithmic systems, governments across the globe have escalated their efforts, enacting a series of online safety laws aimed at mitigating these harms.
Notably, in the European context, two significant pieces of legislation have taken centre stage: the European Union's Digital Services Act (DSA) and the United Kingdom's Online Safety Act (OSA). With the DSA's obligations applying in full from 17 February 2024, this blog delves into both laws, offering an in-depth exploration and a comparative analysis of some of their key provisions.
We’ll cover who these laws apply to and what you can do to begin preparing for them, given the large fines they carry for non-compliance.
Seeking to ensure safe online environments for users, the Digital Services Act (DSA) outlines a comprehensive accountability and transparency regime for a variety of digital services and platforms operating in the European Union (EU). Encompassing a spectrum of digital platforms – from online marketplaces and Internet Service Providers (ISPs) to social media platforms and search engines – the DSA harmonizes the diverging national rules of European Member States that had emerged under the e-Commerce Directive of 2000. This legislation, coupled with the Digital Markets Act and the upcoming EU AI Act, seeks to usher in a new regulatory paradigm for the effective governance of digital technologies in the European Single Market.
Taking a staggered approach towards mitigating systemic risks arising from the use of digital platforms, the DSA imposes cumulative obligations (Articles 11 through 48, housed in Chapter III) on intermediary services that fall under the definitions of Hosting Services, Online Platforms, Very Large Online Platforms (VLOPs), and Very Large Online Search Engines (VLOSEs). The provisions apply differently depending on the nature of the intermediary service, with VLOPs and VLOSEs – platforms and search engines with at least 45 million monthly active users in the EU – subject to the most stringent requirements. The following diagram illustrates the obligations for covered entities under the DSA.
In addition to complying with the general due diligence requirements applicable to all intermediary services, the 22 designated VLOPs and VLOSEs – including the likes of YouTube, Google Search, Instagram, and Amazon, as well as newer entrants like PornHub – are subject to a set of special obligations outlined in Articles 34 to 48 of the DSA. For the first wave of designated services, these came into effect in late August 2023, in line with the European Commission's mandate for compliance within four months of VLOP/VLOSE designation.
To fulfil these obligations, the DSA introduces a set of novel tools – ranging from comprehensive risk assessments and the ability for users to opt out of personalized recommendations, to mandatory algorithmic explainability and transparency and access to platform data for vetted researchers via APIs. Most notable among these requirements, however, is the provision subjecting platforms to independent audits of their safety and integrity efforts under Article 37.
Enforcement of the DSA will primarily be carried out by competent authorities at the national level and their corresponding Digital Services Coordinators. When it comes to overseeing VLOPs and VLOSEs, however, the Commission has exclusive authority to enforce the obligations specific to these entities, collaborating with Digital Services Coordinators to oversee how these services address the systemic risks they present.
Having received Royal Assent on October 26, 2023, the Online Safety Act (OSA) is the United Kingdom’s flagship legislation on online safety, covering a broad range of risks presented by online platforms. Similar to the DSA in its regulatory scope and aims, the OSA explicitly covers user-to-user services – those that allow users to generate, upload, or share content that others can encounter – alongside search services and providers of pornographic content.
An important aspect of the OSA regime is the “duty of care”, which requires covered entities to take appropriate measures to protect users from harmful content. This orients the OSA towards the underlying trust and safety processes and programmes an entity deploys, rather than focusing solely on moderating and removing violative content.
Under the OSA, covered entities will be subject to obligations determined by the extent of their reach and the risks they might present, translating into three broad regulatory categories – Category 1, Category 2A, and Category 2B – with Category 1 services facing the most extensive duties.
As the primary legislation, the OSA lays the groundwork for the development of targeted secondary legislation aimed at addressing specific issues, such as child safety, violent extremism, and the dissemination of harmful or illegal content.
The legislation grants exclusive regulatory authority to the Office of Communications (Ofcom), which will fulfil this role by issuing dedicated codes of practice covering the various areas of harm addressed by the primary and secondary legislation, which covered entities must adhere to. Additionally, Ofcom will be empowered to use its information-gathering powers and carry out investigations to better understand how entities are addressing these risks, routinely engage with services to supervise their trust and safety efforts, and, finally, levy fines and penalties in instances of non-compliance.
The global momentum behind online safety regulation is building. Regardless of their size, companies need to implement a strong combination of proactive and reactive strategies – essential not only for regulatory compliance but also for reducing instances of harm on their platforms. One effective approach is to subject a company's trust and safety tools and processes to independent audits, a measure increasingly encouraged by regulations such as the OSA and required outright for VLOPs and VLOSEs under the DSA.
As leaders in the emerging fields of AI Assurance and Algorithm Auditing, Holistic AI provides comprehensive and tailored solutions to support affected entities with such compliance obligations. We do so by considering several factors, such as the complexity and novelty of conducting such audits, the need to deploy socio-technical methods to audit certain provisions, and the timeline for the application of obligations, among others.
For independent audits of VLOPs and VLOSEs under the DSA in particular, Holistic AI provides a dedicated set of services.
To begin exploring the intersection of our trustworthy AI platform and your initiatives, schedule a demo today.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.