Key Takeaways
The Digital Services Act (DSA) is an EU law that regulates digital services. It requires companies to assess risks, outline mitigation efforts, and undergo third-party audits for compliance.
The Digital Services Act is a lengthy (roughly 300 pages) and horizontal (cross-sector) piece of legislation with a composite set of rules and legal obligations for technology companies. Notably, it focuses on social media, user-oriented communities, and online services with advertising-driven business models.
One of the central goals of the Digital Services Act is to force companies to be more transparent, particularly in the realm of algorithmic accountability and content moderation. To that end, the DSA assigns clear responsibilities to the EU and member states for enforcing these rules and obligations.
The Digital Services Act applies to hosting services, marketplaces, and online platforms that offer services in the EU, so its effects will be felt globally. There is a specific focus on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): VLOPs are defined as platforms with more than 45 million average monthly users in the EU.
Transparency and algorithmic accountability
A risk governance approach underpins the EU Digital Services Act. It imposes regulated responsibilities to address systemic issues such as disinformation, hoaxes and manipulation during pandemics, harm to vulnerable groups, and other emerging societal harms. These issues are categorised as online harm/harmful content in the legislation and are governed by Articles 26, 27 and 28.
Referred to as the risk assessments provision, Article 26 stipulates that VLOPs and VLOSEs must conduct risk assessments annually, or when deploying a new relevant functionality, to identify any systemic risks stemming from the design and provision of their services.
These assessments must identify risks related to all fundamental rights. As the technology harms landscape shifts and evolves, risks may evolve too. Ensuring agile risk assessments is critical.
As stipulated by Article 27, these risk assessments must be accompanied by mitigation measures that are reasonable, proportionate and effective, with full respect for freedom of expression.
Article 28 requires VLOPs to undergo annual independent third-party audits certifying that they comply with Articles 26 and 27 and with the overall reporting requirements. The audits also verify that VLOPs comply with Chapter III of the DSA, and the third-party auditor must prove independence from the industry for an audit to be considered valid.
The enforcement of the Digital Services Act
The DSA has been enforceable since February 17, 2024. It is enforced by national authorities and the EU Commission: each member state must designate a competent authority to supervise and enforce the rules. For VLOPs and VLOSEs, the EU Commission is the enforcement body.
Holistic AI successfully conducted the world’s first independent DSA audit of Wikipedia, meeting the legislation's rigorous standards. This milestone demonstrates Holistic AI’s commitment to fostering safe, transparent, and compliant online platforms.
Schedule a call with us to explore how our DSA solutions can support your compliance efforts.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.