Key Takeaways
The Digital Services Act (DSA) is an EU law that regulates digital services. It requires companies to assess risks, outline mitigation efforts, and undergo third-party audits for compliance. The DSA is part of the EU's approach to regulating digital technologies, along with the Digital Markets Act (DMA) and EU AI Act.
The Act is a lengthy (roughly 300 pages) and horizontal (cross-sector) piece of legislation, combining rules and legal obligations for technology companies. Notably, there is a focus on social media, user-oriented communities, and online services with an advertising-driven business model.
One of the central goals of the Digital Services Act is to put an end to the self-regulation of tech companies and force them to be more transparent, particularly in the realm of algorithmic accountability and content moderation. To do so, the DSA assigns clear responsibilities to the EU and member states for enforcing these rules and obligations. The law will be in full effect starting February 17, 2024.
The Digital Services Act applies to hosting services, marketplaces, and online platforms that offer services in the EU, regardless of their place of establishment. The effect of the Act, and the expectation to comply with it, will therefore be felt globally.
There is a specific focus on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). VLOPs are defined as platforms with over 45 million average monthly active users in the EU.
A risk governance approach underpins the EU Digital Services Act. It pertains to regulated responsibilities to address systemic issues such as disinformation, hoaxes and manipulation during pandemics, harm to vulnerable groups and other emerging societal harms. These issues are categorised as online harm/harmful content in the legislation and are governed by Articles 26, 27 and 28.
Referred to as the risk assessments provision, Article 26 stipulates that VLOPs and VLOSEs must conduct risk assessments annually, or at the time of deploying a new relevant functionality, to identify any systemic risks stemming from the design and provision of their services.
These assessments must identify risks related to all fundamental rights outlined in the EU Charter of Fundamental Rights, focusing on risks to freedom of expression, electoral processes and civic discourse, the protection of minors, public health, and sexual violence. As the technology harms landscape shifts and evolves, risks may evolve too, so ensuring that risk assessments remain agile is critical.
As stipulated by Article 27, these risk assessments must be accompanied by mitigation measures that are reasonable, proportionate and effective. Efforts to mitigate the risks associated with harmful content should bear in mind that, to the extent that harmful content is not illegal, it should not be treated in the same way as illegal content. The DSA's rules will only impose measures to remove, or encourage the removal of, illegal content, in full respect of the freedom of expression.
Article 28 will require that VLOPs submit to annual independent third-party audits to certify that they comply with Articles 26 and 27 and with overall reporting requirements. In addition, the audits would ensure that VLOPs comply with Chapter III of the DSA. The third-party auditor would have to prove independence from the industry for an audit to be considered valid.
Companies should also note that vetted researchers, including academics and civil society organizations, could gain access to relevant data to conduct their own research surrounding risk identification and mitigation.
The DSA will be enforced by national authorities and the European Commission: each member state must designate a competent authority to supervise and enforce the rules, while for VLOPs and VLOSEs the Commission will be the enforcement body.
The Digital Services Act came into effect on November 16, 2022, and digital service providers now have three months to publish their numbers of active users. As per Article 93, the new rules will become applicable from February 17, 2024, although designated VLOPs and VLOSEs will have to be ready to comply four months earlier.
Taking precautionary steps is the best way to get ahead of the Digital Services Act and other global AI regulations. The Holistic AI Governance platform, combined with a team of AI experts informed by relevant policy, can help you manage the risks of your AI systems and processes. Reach out to us at we@holisticai.com to learn more about how we can help you embrace AI confidently.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
Schedule a call with one of our experts