With the European Commission now several months into the implementation and enforcement of the EU Digital Services Act (DSA), a new Commissioner, Henna Virkkunen, ready to take over the DSA baton from Thierry Breton, the Online Safety Act (OSA) marking the first anniversary of its Royal Assent, and Ofcom halfway through its series of consultations on the OSA duties, now is an opportune time to take stock of what has happened so far in the lives of the DSA and OSA.
At the time of writing, the European Commission has designated twenty-three Very Large Online Platforms (VLOPs) (spanning social media, video-sharing platforms, online marketplaces, and app stores) and two Very Large Online Search Engines (VLOSEs) (Google Search and Bing). Several VLOPs have challenged their designations: Amazon and Zalando (the latter arguing that its business focuses on the dissemination of its own retail content alongside user-generated content), and several adult entertainment platforms (on the grounds of disputed user numbers and the requirement to maintain a publicly accessible ad repository). However, these challenges have not yet been heard by the CJEU, and Amazon’s application for interim measures to temporarily suspend some of its obligations as a VLOP was ultimately unsuccessful following an appeal by the European Commission.
In addition, at the time of writing, since the European Commission began implementing and enforcing the DSA:
The majority of the twenty-seven EU member states have now designated their “Digital Services Coordinators” (DSCs) (national authorities responsible for overseeing and enforcing the DSA on a national level) – see the European Commission’s list here;
DSCs have recognised several “Trusted Flaggers” (entities officially recognised by DSCs as being experts in flagging particular types of content on online platforms) – see the European Commission’s list here; and
DSCs have designated several out-of-court dispute settlement bodies (private entities to which the users of online platforms can bring content moderation issues in order to resolve disputes with online platforms) – see the European Commission’s list here.
OSA – wait and see…
Ofcom published its advice to the UK Government on the thresholds that need to be passed for the purposes of categorisation under the OSA (see here). The previous UK Secretary of State for Science, Innovation and Technology, Michelle Donelan, responded to Ofcom’s research and requested more information on how Ofcom set its user number thresholds for categorisations (see here). Since the new Labour government came into power, there have been further exchanges with the new Secretary of State, Peter Kyle, relating to the treatment of “small but risky” platforms (see here and here), and the disorder in the aftermath of the Southport attack (see here). The latter letter is particularly notable in setting out how Ofcom thinks it would have approached enforcement had the OSA’s illegal harms duties been in place at the time.
Regulatory guidance & tools
DSA – centralised guidance and tools, but more to come from Member States
Tools
Under the DSA, providers of hosting services must inform their users of any content moderation decisions they make and the reasons behind those decisions via “statements of reasons”. For online platforms (which are a subset of hosting services), those statements of reasons must also be submitted to the European Commission. To facilitate this, the Commission has launched the “DSA Transparency Database”, whose Application Programming Interface (API) facilitates the submission of these statements and allows the Commission to monitor content moderation decisions in (almost) real time.
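As a very loose illustration of what a machine-submitted statement of reasons might look like, the Python sketch below assembles a minimal JSON record. The function name and every field name and value here are illustrative assumptions only, not the Transparency Database’s actual schema, which providers should take from the Commission’s official API documentation before making any real submission.

```python
import json

# Hypothetical sketch only: every field name and value below is illustrative.
# The Transparency Database's actual schema is set out in the Commission's
# official API documentation and must be checked before any real submission.
def build_statement_of_reasons(decision_ground: str, content_type: str,
                               category: str, explanation: str) -> dict:
    """Assemble a minimal statement-of-reasons record as a JSON-serialisable dict."""
    return {
        "decision_ground": decision_ground,  # basis of the moderation decision
        "content_type": content_type,        # nature of the affected content
        "category": category,                # harm category assigned by the platform
        "explanation": explanation,          # free-text reasoning shown to the user
    }

record = build_statement_of_reasons(
    decision_ground="INCOMPATIBLE_WITH_TERMS",
    content_type="TEXT",
    category="HATE_SPEECH",
    explanation="Post removed under the platform's hate-speech policy.",
)
print(json.dumps(record, indent=2))
```

In practice, a record along these lines would be serialised and posted to the database’s submission endpoint using the platform’s API credentials; the point of the sketch is simply that each moderation decision becomes one structured record that the Commission can aggregate and monitor.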
The Commission has also launched a DSA whistleblower tool that helps employees and other insiders report the allegedly harmful practices of VLOPs and VLOSEs.
Regulatory guidance
The European Commission can introduce implementing acts (to create uniform conditions for the implementation of the DSA) and delegated acts (to amend/supplement non-essential elements of the DSA).
So far, the Commission’s implementing acts include:
an implementing act on procedural rules for the Commission’s use of its enforcement powers and the conduct of DSA-related proceedings (see here);
an implementing act on the information sharing system between the DSCs, the Commission and the European Board for Digital Services (see here); and
a draft implementing act on transparency reporting that, notably, contains prescriptive formats/templates for the presentation of services’ transparency information (see here) (the final version of which is due for publication soon).
So far, the Commission’s delegated acts include:
a delegated act on supervisory fees that VLOPs and VLOSEs must pay the Commission (see here);
a draft delegated act detailing the circumstances when vetted researchers may gain access to data held by VLOPs and VLOSEs (and the purposes for which the accessed data may be used) (see here); and
a draft delegated act on the rules for the procedures, methods and templates used for the audits of VLOPs and VLOSEs (see here).
The Commission has also issued (among other things):
a call for evidence for its upcoming guidance on the Art 28 protection of minors obligation (see here), which will impact all online platforms within the DSA’s scope (and not just VLOPs);
Q&A guidance on the requirement for intermediaries to publish their user numbers (see here) – this is important for all online platforms (and not just VLOPs), since online platforms must publish their monthly active users every six months (however, there are a number of positions in this guidance which organisations have found challenging);
guidelines for VLOPs and VLOSEs on how they should mitigate systemic online risks to democratic elections (see here);
an outline of the approach to the enforcement of the DSA’s requirements that the Commission will adopt (see here); and
press releases on the administrative agreements it has entered into with national media regulators acting as DSCs to help it enforce the DSA (see here and here).
National DSCs have also started to issue their own DSA regulatory guidance. For more information, please speak to the relevant member of your Bird & Bird team in the relevant jurisdiction.
OSA – Lots and lots and lots to read… but still in draft for now
Regulatory guidance
Ofcom’s codes of practice will be key to services navigating their safety duties under the OSA – adherence to the measures set out in the codes will allow regulated services to demonstrate compliance with their overarching duties, whereas alternative approaches will require careful evidencing and explanation. At the time of writing, Ofcom’s illegal harms and children’s consultations have closed and we await Ofcom’s finalised codes.
Illegal harms consultation
Ofcom’s consultation on the protection of people from illegal harms online (see here) closed in February 2024, with the finalised version of the guidance and codes expected to be published in December 2024. Some of the chapters we would recommend reading in particular include:
the summaries of each chapter of the published consultation (see here);
the draft risk assessment guidance (Annex 5 of the consultation, which also contains guidance on the different risk profiles) (see here) – the finalised risk assessment guidance will be crucial for businesses conducting risk assessments as their first step towards OSA compliance, and businesses launching new products may want to start their risk assessments now (based on the draft guidance currently available);
the draft illegal content codes of practice for user-to-user services and search services (Annexes 7 and 8 of the consultation) (see here and here) – these codes contain Ofcom’s recommended safety measures;
the draft illegal content judgments guidance (Annex 10 of the consultation) (see here); and
the summary of Ofcom’s information-gathering, enforcement powers and approach to supervision (see here).
Children’s consultation
Ofcom’s consultation on the protection of children from harms online (see here) closed in July 2024, with the finalised version of the guidance and codes expected to be published in April 2025 (although the final children’s access assessment guidance will come earlier, likely in January 2025). Some of the chapters we would recommend reading in particular include:
the draft children’s access assessment guidance (Annex 5 of the consultation) (see here) – note that there is a children’s access assessment requirement for all in-scope services;
the draft children’s risk assessment guidance (Annex 6 of the consultation – which also contains guidance on different risk profiles) (see here);
the draft children’s safety codes of practice for user-to-user services and search services (Annexes 7 and 8 of the consultation) (see here and here) – these codes contain Ofcom’s recommended children’s safety measures; and
the draft guidance on what “highly effective age assurance” entails (Annex 10 of the consultation) (see here).
Other guidance & consultations
Ofcom has also:
consulted on draft statutory transparency reporting guidance for providers of “categorised services” (see here);
published research on the potential harms that can be caused by deepfakes and AI systems, together with guidance on what technology firms could do to reduce those harms (see here);
conducted a consultation and published guidance for service providers publishing adult entertainment content (see here);
conducted a consultation and published its three-year media literacy strategy (see here and here);
conducted a consultation on strengthening its draft codes by specifying animal cruelty and human torture as types of content that platforms must tackle (see here);
released a new consultation on the fees and penalties regime under the OSA (see here);
issued a call for evidence on researchers’ access to information from online services to study online safety matters (see here);
outlined its research agenda, plan of work (for 2024-2025), and implementation roadmap (see here, here and here); and
published joint statements with the ICO (the UK’s data protection regulator) on “ways of working” when it comes to information sharing and collaboration in areas of mutual interest (see, for example, here).
Other updates of note from the UK Government include:
the new Data (Use and Access) Bill having its first reading in the House of Lords – with its draft text granting the Secretary of State the power to introduce regulations (after consultation) to create a system to allow researchers to access data related to online safety matters (see here and here);
the Department for Science, Innovation and Technology publishing an updated impact assessment of the OSA (see here) and an analysis by Revealing Reality of the impact of the OSA (see here);
a press release indicating an intention to make the updated offence of the non-consensual sharing of intimate images a “priority offence” under the OSA (see here);
a consultation by the previous UK Government (as is required of the Secretary of State under the OSA) and publication of a summary of responses on a proposed introduction of a “super-complaints regime” under the OSA (see here); and
a press release noting the establishment of the first UK-US online safety agreement (see here).
Enforcement
DSA – already underway (especially for VLOPs/VLOSEs)
So far, the European Commission has (among other things):
sent requests for information in relation to all but two of the designated VLOPs/VLOSEs (Wikipedia and XNXX) – one of which led to LinkedIn announcing that it will no longer target EU users with ads based on their participation in LinkedIn groups (see here);
initiated enforcement proceedings against AliExpress, Facebook, Instagram, TikTok and X in relation to potential non-compliance with a broad range of DSA provisions spanning content moderation, complaints handling, ad transparency, recommender systems, the protection of minors and dark patterns;
required TikTok to make binding commitments to permanently withdraw its “TikTok Lite” rewards programme from the EU (see here); and
informed X of its preliminary view that X has breached the DSA’s dark patterns, advertising transparency and data access provisions – X is now able to exercise its right of defence and the European Board for Digital Services must also be consulted.
The Commission is in discussions with Telegram regarding whether it reaches the threshold of becoming a VLOP under the DSA (however, contrary to some reporting, the recent arrest of the Telegram CEO Pavel Durov in France was not in relation to an alleged breach of the DSA).
Enforcement trends amongst national DSCs (including in relation to smaller services and those at the infrastructure layer – mere conduit, caching and hosting services) will emerge in the coming years. The Irish DSC (Coimisiún na Meán) has started sending requests for information to both VLOP and non-VLOP platforms (see here).
The DSA also permits private enforcement, and disputes have already been initiated in Member States, including the Netherlands and Germany.
OSA – first information notices sent, but most duties yet to kick in
The core safety duties for services caught by the OSA will only start applying after the relevant statutory codes of practice come into force (after a period of public consultation and approval by the UK Parliament).
However, Ofcom’s information gathering powers for the purposes of the OSA are already in force. TikTok has received a fine of £1.875 million (see here) for failing to comply with its duties to provide information in response to a formal notice, which is a clear signal from Ofcom to other organisations in this respect (however, please note that, since TikTok was subject to Ofcom’s original VSP regime under the Communications Act 2003, this particular fine was imposed under that Act rather than under the OSA).
A number of our lawyers also practise in closely aligned areas, such as privacy, consumer and telecommunications law, and are well placed to bring cross-practice insights.