On 16 February 2024 the ICO published guidance on the application of data protection law to content moderation activities (the "Guidance"), following a call for views in 2023. The topic is increasingly under regulatory scrutiny in the context of evolving online safety regulations, including the UK's Online Safety Act ("OSA"), which received Royal Assent on 26 October 2023; the Guidance also applies beyond the scope of activities required by the OSA. We have summarised the key takeaways for businesses from the Guidance in this article.
Overall, the Guidance is helpful in clarifying the ICO's position, which is more pragmatic than the EU's in some areas. However, some gaps and questions remain (which we have highlighted below).
The Guidance is relevant, first, for businesses using content moderation to comply with their duties under the OSA. It focuses on moderation on user-to-user ("U2U") services, which allow user-generated content ("UGC", defined in the Guidance by reference to the OSA) to be uploaded or generated and encountered by other users of the service. The ICO defines content moderation (which is not defined in the OSA) as "the analysis of user-generated content to assess whether it meets certain standards; and any action a service takes as a result of this analysis."
The OSA obligations on U2U services that require content moderation include duties to use proportionate measures to prevent users from encountering priority (i.e. the most serious) illegal content, which may require an element of proactivity depending on the service's risk assessment, and to operate proportionate systems and processes to swiftly remove illegal content upon becoming aware of its presence. Further duties (including preventative duties) relating to legal but harmful content apply to U2U services likely to be accessed by children, and to the largest/riskiest "Category 1" U2U services.
Ofcom closed its consultation on illegal harms on 23 February 2024, only a week after the Guidance was published. Under the OSA, U2U services have a "cross-cutting duty" to have particular regard to privacy, and the Ofcom consultation refers to this throughout (though at a high level). U2U services should therefore have regard to the Guidance when complying with the OSA's cross-cutting duty, in addition to their own data protection duties under the UK GDPR. Otherwise, however, the Guidance refers U2U services to Ofcom's guidance for compliance with their OSA duties.
The Guidance also applies to businesses using content moderation other than in connection with OSA duties. This will be relevant, for example, to businesses not directly subject to the OSA but nevertheless conducting moderation (such as hosting services without a U2U element, or U2U services falling within one of the OSA's exemptions), and to businesses subject to the OSA conducting additional moderation, e.g. in connection with enforcement of their Terms of Service ("ToS"). These types of moderation will often reflect a commercial decision, or a response to public relations concerns and/or stakeholder pressure.
Lastly, the Guidance applies to organisations providing content moderation services – often as a processor, though the ICO has flagged that determining processing roles in this area can be complex. This may include companies providing AI content moderation technologies or outsourced human moderation, for example.
The ICO confirms that where the UGC itself is processed in a content moderation context, this is likely to be personal data because it is either obviously about someone, or is connected to someone, making them identifiable (e.g. the user whose account posted the UGC).
Other personal data may also be processed, however: for example, contextual information used to make content moderation decisions, such as age, location and previous activity. This connects to s.192 OSA, which says organisations should assess "reasonably available" information when determining whether a piece of content is illegal. Businesses will need to comply with data protection law in respect of this information too, regardless of whether it has been separated from information about the user account during the moderation process, since separation will only render it pseudonymised rather than anonymised (though it is a helpful security measure).
Furthermore, once a moderation action has been decided upon, the ICO confirms that this "generated" information will also constitute the relevant user's personal data.
The Guidance makes recommendations for compliance with the data protection principles when conducting content moderation as a controller.
The ICO says the "most likely" legal bases under Article 6 UK GDPR will be legal obligation or legitimate interests.
Contractual necessity may be possible where processing is necessary to enforce the ToS, but the ICO says legal obligation/LI are likely to be more suitable; and the fact that children may not have capacity to enter into the relevant ToS must be considered. The ICO also says it is not possible to rely on contract where processing is for service improvement of content moderation systems. This is in line with the EU's increasingly restrictive interpretation of contractual necessity.
Businesses must assess whether special category data ("SCD") may be processed. Given the nature of UGC that might be moderated (for example, content indicating political affiliation, or users' own eating disorders or mental health-related medical conditions), we think this is likely to be relevant for many U2U services.
The Guidance says SCD will be processed where UGC directly expresses SCD, or where SCD is intentionally inferred from the UGC. It will not, however, be processed where a moderator "may be able to surmise" that SCD is present but has no purpose of inferring this, nor any intention of treating people differently based on such an inference. This is consistent with the ICO's narrower interpretation of SCD as compared to the EU.
The ICO suggests the most appropriate Article 9 condition for processing will be substantial public interest, citing the conditions in Part 2 of Schedule 1 to the Data Protection Act 2018 ("DPA") of (i) preventing or detecting unlawful acts, (ii) safeguarding children and individuals at risk, and (iii) regulatory requirements (the latter applicable where establishing whether someone has committed an unlawful act or has been involved in dishonesty, malpractice or other seriously improper conduct). Businesses should note that all of these require an "appropriate policy document" to be in place (though not for (i) where disclosing data to the relevant authorities).
However, we believe a gap remains against certain OSA obligations. The ICO has not dealt with this gap, so it remains an area of uncertainty.
Similarly, businesses must assess whether criminal offence information will be processed, which also requires authorisation under domestic law. This includes information "relating to" criminal convictions or offences, which the ICO confirms can include suspicions or allegations of offences.
Organisations may have an argument in some circumstances that information does not reach this threshold, such as where they are enforcing based on a breach of ToS rather than making illegal content judgements. There are other scenarios, however, where the threshold seems particularly likely to be reached, such as where child sexual abuse material ("CSAM") is reported to the National Crime Agency ("NCA"). Assessment should be on a case-by-case basis.
The ICO refers to the same conditions for authorisation under domestic law as with SCD, satisfaction of which (in particular "prevention or detection of unlawful acts") seems more likely in this context.
Many businesses will outsource moderation to third-party technology providers, some of whom will deploy AI.
The Guidance outlines a number of considerations where content moderation activities are outsourced – for example, complexities in determining processing roles and the importance of regular review and audit. Where providers are using personal data for "other" purposes such as product development or improvement, the ICO says the main service "should confirm that the provider would be acting as a controller" and consider how information about this processing will be communicated to end users. This raises concerns around purpose limitation and, for providers, around the extent of their own UK GDPR compliance obligations.
Certain areas are explicitly excluded from the Guidance, including behavioural profiling and the reporting of CSAM to the NCA (on which the ICO will provide guidance in the future), as well as on-device moderation, which is covered by the Privacy and Electronic Communications (EC Directive) Regulations 2003 ("PECR").
The Guidance was prepared based on existing data protection legislation in the UK. The UK Government is in the process of updating the regime via the postponed Data Protection and Digital Information ("DPDI") Bill, which now has a tentative timeline for Royal Assent of mid-2024. The Bill may well affect the rules around automated decision-making under Article 22 UK GDPR, and could perhaps address some of the SCD/OSA "gaps" identified above (though it does not currently do so).
Organisations in scope of the OSA should also keep an eye on Ofcom's Codes of Practice, with the illegal harms Codes of Practice due to come into force in late 2024 and others due to be introduced on a phased basis.
Contact Bird & Bird's privacy and online safety teams if you have any questions (including if you are not sure whether the OSA applies to you).