Seeking to ensure safe online environments for users, the Digital Services Act (DSA) sets out a comprehensive accountability and transparency regime for a variety of digital services and platforms operating in the European Union (EU). Covering a spectrum of digital platforms, the DSA harmonises the diverging national rules of European Member States that had emerged under the E-Commerce Directive of 2000. This legislation, together with the Digital Markets Act and the upcoming EU AI Act, seeks to usher in a new regulatory paradigm for the effective governance of digital technologies in the European Single Market.
This blog focuses on the Independent Auditing provision for Very Large Online Platforms (VLOPs) and Search Engines (VLOSEs), and how affected entities can ensure compliance with this regulation.
Taking a staggered approach, the DSA imposes cumulative obligations (Articles 11 through 48, housed in Chapter III) on intermediary services that fall under the definitions of Hosting Services, Online Platforms, and Very Large Online Platforms (VLOPs) and Search Engines (VLOSEs). The provisions apply differently depending on the nature of the intermediary service, with VLOPs and VLOSEs – platforms with more than 45 million monthly active users in the EU – subject to the most stringent requirements. The following diagram illustrates the many obligations for covered entities under the DSA.
In addition to complying with the general due diligence requirements applicable to all intermediary services, the 19 designated VLOPs and VLOSEs – including the likes of YouTube, TikTok, Instagram, Amazon and Zalando – are subject to a set of special obligations outlined in Articles 34 to 48 of the DSA. These came into effect in late August 2023, in line with the European Commission's mandate for them to be enforced four months from the date of VLOP/VLOSE designation, i.e. from April 2023.
To fulfil these obligations, the DSA introduces a set of novel tools – ranging from comprehensive risk assessments, the ability for users to opt out of personalised recommendations, and mandatory algorithmic explainability and transparency, to providing vetted researchers access to platform data via APIs. Most notable among these requirements, however, is the provision under Article 37 for subjecting platforms to independent audits of their safety and integrity efforts.
In a global first, Article 37(1) of the DSA requires VLOPs and VLOSEs (Audited Providers) to commission external auditors (Auditing Organisations) to test and validate, on an annual basis, their compliance with the due diligence commitments under Chapter III of the legislation. These audits must be aligned with the yearly cycle of risk assessments that VLOPs and VLOSEs perform under Article 34 of the DSA, and must also cover any relevant Codes of Conduct and Crisis Protocols. Further, they must be performed with a reasonable level of assurance, wherein the Auditing Organisation should have a “high, but not absolute, level of confidence that there have been no misstatements such as omissions, misrepresentations, or errors, which were not detected in the audit”, and the results must be submitted to the European Commission in the form of an Audit Report.
Given the novelty of this regulatory tool, the European Commission has sought to provide clarity to affected entities by releasing a draft Delegated Regulation on conducting independent audits in May 2023. These draft delegated rules provide guidance on the procedural modalities involved, clarifying the methodologies, steps, and reporting templates that must be used for these audits. In particular, the draft rules call for the systematic auditing of algorithmic systems, which are defined to include advertising systems, recommendation engines, content moderation technologies, and other features that may use novel technologies such as generative AI and foundation models.
The delegated rules clarify the relationship between Audited Providers and Auditing Organisations, laying down provisions for selecting auditors as well as mechanisms for data sharing and cooperation between the two. Given the complex and specialised nature of such audits, the draft permits Audited Providers to contract multiple Auditing Organisations, or a consortium of auditors, to conduct them.
The table below describes the important aspects of the delegated regulation:
Pursuant to the draft Delegated Regulation on independent audits under the DSA, Holistic AI provides the following services to covered entities:
Schedule a demo with our experts to find out more.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.