France: The SREN Law and its Impact on Digital Platforms

Written By

Sacha Bettach

Senior Associate
France

As a senior associate in our Paris IT, commercial and dispute resolution teams, and a member of the Paris Bar, I advise our clients on both contentious and non-contentious matters.

Alexandre Vuchot

Partner
France

I'm a partner in our international Commercial group, based in Paris, where I provide our clients with strategic commercial advice.

The SREN Law (Sécurité et Régulation de l'Espace Numérique - Security and Regulation of the Digital Space) represents a major legislative development in the regulation of digital platforms in France. It adapts French law to allow the application of the Digital Services Act (DSA) and the Digital Markets Act (DMA), two European texts that impose new obligations on digital giants, and designates the various regulatory authorities responsible for enforcing these regulations in France.

The law aims to strengthen the security of online users, combat illegal content and establish greater transparency in the practices of the major digital platforms. 

Context and Objectives of the SREN Law

Faced with an increase in harmful content online, such as disinformation, hate speech and scams, the law aims to make the digital space safer by imposing new obligations on online platforms, particularly social networks, search engines and online marketplaces.

The main objectives of the law are to:

  1. Improve user safety: by requiring platforms to introduce stricter moderation measures,
  2. Increase transparency: by requiring platforms to publish detailed reports on their moderation practices and content management,
  3. Combat illegal content: by imposing shorter deadlines for the removal of reported illegal content.

Main Provisions of the SREN Law for Platforms

1. Moderation Obligations

Digital platforms are now required to implement effective moderation systems to detect and remove illegal content. This includes content such as online hate, child pornography, terrorism and scams. Major platforms must also recruit and train personnel dedicated to moderation.

2. Transparency Reports

Platforms must publish annual reports detailing their moderation efforts, the number of items removed, the moderation criteria used, and the action taken against repeat offenders. These reports must be accessible to the public, allowing for better understanding and evaluation of platform practices.

This must align with the DSA provisions, which require platforms to ensure the identification of sellers using their platform and the completeness of pre-contractual information related to the products for sale. This includes details such as price, delivery times, payment methods, and information on labelling or branding. Platforms also have obligations regarding fairness, including a ban on using "dark patterns" (manipulative digital techniques).

3. Response Time

Platforms are now obliged to remove certain types of illegal content within 24 hours of it being reported. For terrorist content, this period is reduced to one hour. Failure to comply with these deadlines can result in severe fines.

This must also align with the DSA, which requires online sales platforms to carry out random automated checks on adverts to ensure that they do not correspond to products that have already been reported as illegal. Platforms must also inform all consumers who have purchased a non-compliant or dangerous product in the previous six months.

4. Coordination with the Authorities

The SREN Law encourages cooperation between digital platforms and the judicial and administrative authorities. Platforms must report serious illegal content to the competent authorities and provide the information needed to identify the authors of such content.

In addition, the DSA provides for very large platforms (with more than 45 million active users in the EU) to be subject to enhanced obligations: mitigation of the systemic risks posed by their platforms, independent audit, and access for one year to all ads served, so that they can be analysed.

Sanctions

The DGCCRF is responsible for overseeing the implementation of the regulation by marketplaces established in France.

The penalties provided for by law are two years’ imprisonment and a fine of up to 6% of worldwide turnover.

The DGCCRF will ensure that these provisions of the DSA are properly implemented in France, in close cooperation with ARCOM and the CNIL, which have also been designated as supervisory authorities in their respective areas of competence, with ARCOM playing a coordinating role. There will also be close coordination with the European Commission and the other Member States, which are responsible respectively for supervising very large platforms and platforms established outside France (but which may address the French market), to ensure uniform application of the texts at European level.
