Navigating the nexus of Policy, Digital Technologies, and Futures (S1/E6)
S1/E6: The Digital Services Act and the new notions of VLOP and VLOSE – Part 1
Welcome to a new episode of our series, which explains selected European Union (EU) policies and laws that were recently proposed or applied, and which impact the Software Industry! Today I’m going to explore the very recent Digital Services Act, or DSA.
The DSA is another comprehensive regulatory framework, created to address concerns about user safety, privacy, and fair competition in the digital sphere. It was proposed by the European Commission (EC) in December 2020 to update and strengthen the legal framework governing digital services within the EU.
The Act encompasses a wide range of digital services, including online platforms, social media networks, online marketplaces, and search engines. It focuses on enhancing user protection, promoting competition, and addressing illegal content and harmful online practices, like disinformation.
Key objectives and provisions of the DSA include the following.
Platform Responsibility: The DSA introduces a new concept called "intermediary services," which includes online platforms that facilitate the sharing of user-generated content. This establishes a completely new regulatory environment: these platforms will be required to take responsibility for the content they host and to implement measures preventing the dissemination of illegal content and harmful behaviour, such as hate speech, terrorist propaganda, disinformation, and counterfeit products.
Obligations for the largest platforms to rein in “systemic risks”: EU lawmakers consider that the largest platforms pose the greatest potential risks to society, including negative effects on fundamental rights, civic discourse and elections, gender-based violence, and public health. That’s why the DSA will obligate platforms with over 45 million monthly active users in the EU, e.g. YouTube, TikTok, and Instagram, to formally assess how their products, including algorithmic systems, may exacerbate these risks to society and to take measurable steps to prevent them.
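For the software folks among my readers, the designation rule above boils down to a simple threshold test. Here is a minimal, purely illustrative sketch (not an official tool, and the example figures are hypothetical): the DSA sets the bar at 45 million average monthly active recipients in the EU, roughly 10% of the EU population.

```python
# Illustrative sketch of the DSA's "very large" designation threshold.
# 45 million average monthly active EU recipients (~10% of the EU population).
VLOP_THRESHOLD = 45_000_000

def is_very_large(monthly_active_eu_users: int) -> bool:
    """Return True if a service meets the DSA's VLOP/VLOSE user threshold."""
    return monthly_active_eu_users >= VLOP_THRESHOLD

# Hypothetical figures, for illustration only:
print(is_very_large(100_000_000))  # a major platform -> True
print(is_very_large(5_000_000))    # a smaller platform -> False
```

Of course, the real designation is a formal decision by the Commission based on user numbers that platforms must themselves publish, not a one-line check.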
Enhanced User Rights: The Act emphasizes user rights and transparency. Online platforms will be obligated to provide users with clear information regarding their terms and conditions, content moderation policies, and data processing practices. Users will have greater control over their data and the ability to contest decisions made by platforms regarding content removal or account suspension. In addition, the DSA forbids designated service providers from implementing any major design change without conducting a prior risk assessment. For instance, Twitter’s erratic behaviour following Musk’s takeover would have been forbidden, had the DSA been in force.
Market Competition: The DSA aims to address the dominance of large online platforms by introducing stricter rules for very large online platforms (here, the VLOP!) and very large online search engines (and the VLOSE, which isn’t the opposite of VWIN…). These platforms, which have significant market power, will face additional obligations, including ensuring fair and transparent practices, facilitating interoperability, and sharing certain data with competing businesses.
Enforcement and Oversight: Enforcement will be coordinated between national and EU-level bodies. The Commission will have direct supervision and enforcement powers over the VLOP and the VLOSE, and can impose fines of up to 6% of their global turnover. But the Act also establishes a Digital Services Coordinator (DSC) within each EU member state to oversee compliance with the new rules. The DSC will be an independent regulator responsible for enforcing the rules on smaller platforms established in its country. They will work closely with the European Commission and other DSCs to ensure effective implementation and enforcement of the DSA provisions.
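To give a feel for the scale of that enforcement power, here is a minimal sketch of the fine ceiling arithmetic (the 6% cap comes from the DSA; the turnover figure below is hypothetical):

```python
# The DSA caps fines at 6% of a provider's total worldwide annual turnover.
FINE_CAP_RATE = 0.06

def max_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a DSA fine for a given global annual turnover (EUR)."""
    return FINE_CAP_RATE * global_annual_turnover_eur

# A hypothetical platform with EUR 10 billion in global annual turnover:
print(f"EUR {max_fine(10_000_000_000):,.0f}")  # -> EUR 600,000,000
```

In other words, for the largest players the ceiling reaches into the hundreds of millions of euros, which is precisely why the Commission keeps direct supervision of the VLOP and the VLOSE for itself.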
Legally-mandated data access for external scrutiny: Platforms’ self-assessments and risk-mitigation efforts will be open to scrutiny. For this, platforms will be forced to share their internal data with independent auditors, EU and Member State authorities, as well as researchers from academia and civil society, who will be in a position to verify such claims and identify systemic risks, holding platforms accountable.
I think I’ll stop this episode here. Otherwise, it’d become quite long and you could lose interest in this Act, which would be exactly the opposite of the outcome I’m aiming for. But don’t despair. Part 2 will come really soon, giving you a glimpse of the next stations in this legislative train, as well as my opinion on potential benefits and implementation challenges. Just watch this space!
[This blog series is inspired by research work that is or was partially supported by the European research projects CyberSec4Europe (H2020 GA 830929), LeADS (H2020 GA 956562), and DUCA (Horizon Europe GA 101086308), and the CNRS International Research Network EU-CHECK.]
CNRS - France
Digital Skippers Europe (DS-Europe)