CJEU decision on algorithmic transparency and trade secret protection (CJEU C-203/22)

Written By

Edoardo Barbera

Partner
Italy

My practice is intellectual property, particularly the protection and exploitation of innovation and information.

Adriano D'Ottavio

Counsel
Italy

I am a lawyer with a strong passion for new technology. My goal is to provide practical solutions to complex issues.

On 27 February 2025, the Court of Justice of the European Union ("CJEU" or "Court") issued a crucial judgment in Case C-203/22 (the "D&B Decision"), addressing the delicate balance between individuals' right to transparency in automated decision-making processes and the protection of trade secrets. The ruling forms part of the debate sparked by the judgment of 7 December 2023 in Case C-634/21 (the "Schufa Decision") on the application of the General Data Protection Regulation 2016/679 ("GDPR") to algorithmic evaluations, particularly in the credit scoring sector.

The context of the dispute

The case originates from the request of an individual ("CK") who was refused the conclusion of a telephone contract following a negative assessment of his creditworthiness carried out by Dun & Bradstreet Austria GmbH ("D&B"), a company operating in the business information sector that relies on algorithmic systems to assess the creditworthiness of natural persons.

Based on the complaint lodged by CK, the Austrian Data Protection Authority ordered D&B to provide the individual with details on the functioning of these decision-making processes, enforcing the right set forth in Article 15(1)(h) of the GDPR, which requires data controllers to provide “meaningful information about the logic involved” in automated decisions.

The company challenged the decision before the Austrian Federal Administrative Court, claiming that, because the information was protected as a trade secret, it was not required to disclose to CK anything beyond the information already provided. The administrative court ruled that D&B had infringed Article 15(1)(h) of the GDPR by failing to provide CK with meaningful information about the logic involved in the automated decision-making process or, at the very least, by failing to adequately justify its inability to do so. However, enforcement of this decision was refused on the grounds that D&B had sufficiently complied with its information obligation.

The Vienna Administrative Court therefore referred the case to the CJEU to clarify the delicate balance between the right to transparency and the protection of sensitive information such as trade secrets.

The conclusions of the CJEU

In its judgment, the CJEU ruled as follows:

  • Right to transparency: individuals have the right to receive intelligible information about the logic involved in automated decision-making processes, including the key parameters and their influence on the assessment. The data controller cannot satisfy this requirement merely by communicating a complex mathematical formula or by providing only generic information; rather, it must ensure that the data subject can understand the functioning of the mechanism underlying the automated decision-making process used in the specific case (an illustrative sketch of this kind of summary follows this list).
  • Limits to trade secret protection: although the GDPR recognises that the right to the protection of personal data is not an absolute right and must be balanced against other fundamental rights (Recital 4 of the GDPR), and that the data subject's right of access to personal data should not adversely affect the rights and freedoms of others, including trade secrets (Recital 63 of the GDPR), these considerations do not justify an outright refusal to grant the data subject access to the requested information. Trade secret protection may therefore not be invoked in absolute terms to deny the data subject access to relevant information.
  • Necessary balancing: where the data controller considers that the information to be provided to the data subject under Article 15(1)(h) GDPR contains third-party data protected by that regulation or trade secrets within the meaning of Article 2(1) of Directive (EU) 2016/943, it must disclose that information to the competent supervisory authority or court, which must balance the rights and interests at issue with a view to determining the extent of the data subject's right of access.
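
By way of illustration only, the following minimal sketch shows one way a controller might summarise the key parameters of a scoring model and their relative influence in plain language, without disclosing the full algorithm. The parameter names, weights and the linear model itself are hypothetical assumptions made for this example and do not reflect D&B's actual system.

```python
# Hypothetical, illustrative only: a toy linear scoring model used to show how
# key parameters and their relative influence could be summarised for a data subject.
HYPOTHETICAL_WEIGHTS = {
    "years_at_current_address": 0.15,
    "number_of_late_payments": -0.45,
    "income_to_debt_ratio": 0.30,
    "number_of_recent_credit_inquiries": -0.10,
}

def explain_key_parameters(subject_data: dict) -> list[str]:
    """Rank each parameter by its contribution to this subject's score
    (weight * value) and describe its effect in plain language."""
    contributions = {
        name: weight * subject_data[name]
        for name, weight in HYPOTHETICAL_WEIGHTS.items()
    }
    ranked = sorted(contributions.items(), key=lambda item: abs(item[1]), reverse=True)
    return [
        f"'{name}' {'raised' if value > 0 else 'lowered'} your score "
        f"(relative influence: {abs(value):.2f})"
        for name, value in ranked
    ]

if __name__ == "__main__":
    subject = {
        "years_at_current_address": 2,
        "number_of_late_payments": 3,
        "income_to_debt_ratio": 1.8,
        "number_of_recent_credit_inquiries": 4,
    }
    for statement in explain_key_parameters(subject):
        print(statement)
```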

Lights and shadows between the right to an explanation and trade secret protection

The CJEU decision raises several questions and potential critical issues for companies using automated decision-making systems and, in particular, for those operating in the creditworthiness assessment area.

The distinction between scoring and automated decision-making

One of the most relevant issues concerns the applicability of Article 22 of the GDPR to companies operating in the credit scoring sector.

In the case at hand, D&B did not itself take a decision concerning CK; it merely provided a credit score, which was subsequently used by a telecoms operator to decide whether or not to enter into a contract with the individual.

This raises a fundamental question: can the score provider be held responsible for an "automated decision" within the meaning of Article 22 GDPR, or is the real decision taken by the third party using that information? The distinction matters because companies such as D&B operating in the credit scoring business do not directly determine whether an individual can access a specific service (such as a phone contract or a loan); rather, they process data that is later interpreted by other companies which are, de facto, in a position to exercise decision-making power over the individual.

In the Schufa Decision, which preceded the judgment at issue, the CJEU stated that the automated scoring of an individual's creditworthiness, where it plays a decisive role in the establishment, implementation or termination of a contractual relationship, is itself an automated decision-making process and consequently falls within the scope of Article 22 of the GDPR[1]. In other words, the CJEU would thus seem to hold that the scoring activity carried out by a company such as Schufa or D&B falls under Article 22 whenever the decision made by a third party is decisively influenced by the prior scoring activity, thereby denying, de facto, any general and stable qualification of that activity: its classification will depend, on a case-by-case basis, on the actual use made of the score by economic operators.

Whatever the legal implications raised by the affirmation of such a principle, especially at national level, it is clear that the CJEU does not provide a definitive answer to this "dilemma".

The boundary between transparency and trade secrets

The judgment reiterates that the protection of personal data must be balanced against other fundamental rights.

After all, the D&B Decision shows that full and detailed disclosure of a trade secret is not necessarily required when the right of access is exercised: "first, the controller should find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching the automated decision. Second, the GDPR requires the controller to provide meaningful information about the logic involved in that decision, but 'not necessarily a complex explanation of the algorithms used or disclosure of the full algorithm'" (paragraph 60). The judgment also gives an example of "sufficient disclosure", confirming that the duty to provide access can be fulfilled without communicating all the relevant information: "As regards, specifically, profiling such as that at issue in the main proceedings, the referring court could, inter alia, find that it is sufficiently transparent and intelligible to inform the data subject of the extent to which a variation in the personal data taken into account would have led to a different result" (paragraph 62).
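
Purely as an illustrative sketch of the kind of "variation" explanation referred to in paragraph 62, the snippet below tells a data subject whether changing a single item of personal data would have led to a different outcome. The scoring function, approval threshold and field names are assumptions invented for the example, not the model at issue in the case.

```python
# Illustrative sketch only: reporting how a variation in one input would have
# affected the outcome, without disclosing the full scoring model.
APPROVAL_THRESHOLD = 0.5  # hypothetical cut-off applied by the contracting party

def hypothetical_score(data: dict) -> float:
    """A toy scoring rule standing in for the real (undisclosed) model."""
    return (
        0.3 * data["income_to_debt_ratio"]
        - 0.2 * data["number_of_late_payments"]
        + 0.1 * data["years_of_credit_history"]
    )

def counterfactual_explanation(data: dict, field: str, varied_value) -> str:
    """State whether changing a single field would have led to a different result."""
    original = hypothetical_score(data)
    varied = hypothetical_score({**data, field: varied_value})
    flipped = (original >= APPROVAL_THRESHOLD) != (varied >= APPROVAL_THRESHOLD)
    return (
        f"With '{field}' = {varied_value} instead of {data[field]}, the score would "
        f"move from {original:.2f} to {varied:.2f}; the outcome would "
        f"{'have been different' if flipped else 'not have changed'}."
    )

if __name__ == "__main__":
    subject = {
        "income_to_debt_ratio": 1.0,
        "number_of_late_payments": 2,
        "years_of_credit_history": 3,
    }
    print(counterfactual_explanation(subject, "number_of_late_payments", 0))
```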

However, the Court also noted that a restriction of the right to an explanation is possible only when it is necessary and proportionate to safeguard other rights.

In particular, the Court clarified that, in the event of a conflict between the exercise of the right of access and the protection of trade secrets, a balance must be struck between those rights. To that end, the relevant information - third-party personal data and/or trade secrets - must be disclosed to the competent supervisory authority or court, which must balance the rights and interests at issue with a view to determining the extent of the right of access.

As to how this assessment is to be carried out, the CJEU states that the supervisory authority or the court will assess on a case-by-case basis whether, and to what extent, the requested information may be disclosed, but it does not set out the criteria to be applied for this purpose.

Although this solution offers protection for both interests at stake, it introduces the risk of slower and more complex handling of access requests, increasing costs and timelines for companies, especially in sectors highly dependent on algorithmic models.

The issue of verifying the accuracy of the data

Another crucial aspect concerns the possibility for the data subject to verify the accuracy of the information provided by the data controller.

The Court clarified that the right of access under Article 15(1)(h) of the GDPR entails the provision of meaningful information about the logic involved in automated decision-making, but not necessarily the full disclosure of the algorithm or of the technical details of the scoring model. However, the Court emphasised that the information provided must be sufficiently clear and intelligible to enable the data subject to assess the accuracy and fairness of the processing of his or her data, and to exercise his or her rights to rectification or to object if he or she considers the decision-making process to be incorrect.

Implications for business

In the light of the D&B Decision, companies using automated decision-making systems, particularly those in the credit scoring business, are thus required to review their policies for disclosing information to users, ensuring that:

  • Sufficient, clear and intelligible explanations are provided to data subjects from the first contact, in order to avoid an escalation that may lead to the disclosure of their trade secrets to a supervisory authority or court.
  • A procedure is implemented to verify data accuracy in case of disputes.
  • Mechanisms are in place for identifying and disclosing relevant information to the supervisory authorities or the courts in cases of conflict between the right of access and trade secret protection.

Conclusions

The CJEU judgment in Case C-203/22 represents an important step forward in the framing and consequent regulation of automated decisions in the EU, but various questions of interpretation remain unresolved.

The main issue concerns the distinction between those who carry out profiling and/or assessment activities on an individual and those who, instead, make decisions based on such activities. This topic may be developed further in case law and may, at the same time, call for legislative action to fill the gap left by the absence of an express authorisation at national level which, pursuant to Article 22(2)(b) of the GDPR, must also lay down suitable measures to safeguard the rights, freedoms and legitimate interests of the individuals subject to the automated decision.

Companies operating in the credit scoring sector will also need to adopt a more transparent and structured approach, without compromising the protection of their trade secrets. In the absence of more detailed guidelines from the competent authorities, the risk of litigation remains high, making it necessary to adopt a careful compliance strategy that properly balances information transparency obligations with the right of economic operators to protect their trade secrets.

Finally, the judgment under consideration certainly enhances the protection of individuals' rights with respect to automated decisions, forcing companies to strike the necessary balance between transparency towards individuals and the protection of their trade secrets. Nevertheless, while emphasising the need to provide individuals with "meaningful information", the CJEU does not provide detailed criteria as to what information is sufficiently "meaningful", especially when the explanation involves complex algorithms.

[1] "Article 22(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes 'automated individual decision-making' within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person."
