This article complements a separate summary of provisions in the Data Act, primarily relating to IoT devices and cloud switching (see here). We have also published a summary of proposed changes to the AI Act (see here).
The main GDPR changes proposed by the European Commission are considered below:
A subjective approach to the definition of ‘personal data’, in need of further clarification
The fact that another entity may be able to identify data subjects would not make the data personal in the hands of the current holder. The relevant question would be whether it is reasonably likely that this particular entity can identify the data subject, using all means reasonably likely to be used. The fact that a potential subsequent recipient of disclosed data may later be able to identify data subjects would not, in itself, make the data personal for the current holder.
This is welcome but clarification would help. For example, would a controller still need to put in place a data processing agreement to meet Art 28 GDPR requirements with a processor when the data is not personal so far as the processor is concerned? Would a controller still need to put in place appropriate safeguards with a recipient in a third country under Chapter V of the GDPR, if the data is not personal so far as the recipient is concerned?
The European Commission would be empowered to issue subsequent implementing acts to specify the means and criteria for determining whether data resulting from pseudonymisation no longer constitutes personal data for certain entities.
Firmer "compatibility" presumption in favour of scientific research purposes etc.
Article 5(1)(b) GDPR currently states that processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes is “not [an] incompatible” purpose. The double negative is turned into a positive statement, which is clearer – so long as the conditions in Article 89(1) GDPR are met (i.e. anonymise or pseudonymise where possible). There is no need to go on to consider the tests for compatible processing in Article 6(4) GDPR.
A broader definition of ‘scientific research’ is added
At present, the concept of ‘scientific research’ is not defined in the GDPR; it appears only in the Recitals, which refer to privately funded research as well as studies carried out in the public interest. A proper definition would be added to Article 4 GDPR (i.e. “scientific research” means any research which can also support innovation, such as technological development and demonstration. These actions shall contribute to existing scientific knowledge or apply existing knowledge in novel ways, be carried out with the aim of contributing to the growth of society’s general knowledge and wellbeing and adhere to ethical standards in the relevant research area. This does not exclude that the research may also aim to further a commercial interest).
It could be useful to clarify how Recital 159 should be read alongside this.
Two more exemptions under Article 9 of the GDPR for special category data
There would be two new derogations from the prohibition on processing special category data:
Processing of personal data to train AI models is a legitimate interest
A new provision would confirm that processing personal data for model training can constitute a legitimate interest. It remains necessary to demonstrate that the processing is justified and to conduct a balancing test (i.e. two of the three parts of the controller’s Legitimate Interest Assessment (LIA) remain to be carried out).
Between the leaked draft versions and the final proposal, a possibility for EU or national laws to override the above and require consent for such processing activities has been added. Rather than enabling harmonisation, this last-minute addition risks entrenching further fragmentation. It will be interesting to see what happens to it during the trilogue phase.
Taking the sting out of DSARs used in an abusive way
Where individuals abuse data subject access requests for purposes other than protecting their data, the controller would be able to either reject the DSAR or charge a reasonable fee for handling it. This may help with DSARs used as a form of discovery in employment proceedings.
Consistent with current rules, the controller would bear the burden of demonstrating that the request is manifestly unfounded or that there are reasonable grounds to believe that it is excessive.
Lighter transparency requirements under Article 13 of the GDPR in specific circumstances
Controllers need to provide extensive information to individuals about the processing of their personal data. Article 13 applies where controllers directly collect personal data from individuals. The European Commission’s proposal includes two new elements:
On the topic of transparency, in addition to the requirements of Articles 13 and 14 of the GDPR, certain Member States have added or maintained supplemental elements to be addressed in privacy notices. For instance, in France, privacy notices must indicate the existence of a right for data subjects to give instructions concerning the use and disclosure of their personal data after their death. The simplification exercise initiated by the European Commission could be an opportunity to put an end to this tendency, although at present the proposal does not address this point.
Minor flexing of automated individual decision-making rules
Entirely automated individual decisions with legal or similarly significant effects remain permitted in limited cases, such as where necessary for a contract. An amendment makes clear that this ground can still apply even if the same decision could technically be made manually.
One breach portal to rule them all
There is a proposal for a single reporting point for all incidents, following the "report once, share many" principle. This would address requirements under the GDPR, the Network and Information Security Directive, the Digital Operational Resilience Act and the Critical Entities Resilience Directive. The reporting obligation for communications service providers under the ePrivacy Directive would be repealed on the basis that it would be superfluous.
The threshold for reporting personal data breaches to data protection authorities would be raised — only incidents posing a high risk to data subjects would need to be reported. The period for reporting to authorities would be extended from 72 to 96 hours, and a single incident reporting form would be introduced.
The European Data Protection Board (“EDPB”) would be tasked with drafting a proposal for a common template for notifying a personal data breach, as well as a list of the circumstances in which a personal data breach is likely to result in a high risk to the rights and freedoms of an individual. The common template and the list would have to be reviewed at least every three years and updated where necessary.
Unfortunately, the European Commission’s proposal does not deal with the personal data breach reporting obligations of processors. They currently remain obliged to report all personal data breaches (irrespective of their risk level) to their controllers (see Article 33(2) of the GDPR); a change here would be welcome.
Consistent approach to DPIAs across the EU
The EDPB would be required to develop lists setting out when data protection impact assessments are required. These lists would be valid across the EU and would replace the current myriad of national white and black lists issued by national data protection authorities (in this spirit, their powers on this topic under Articles 57(1)(k) and 64(1)(a) would be deleted). The EDPB would also be commissioned to develop a common EU-wide template and methodology for conducting a DPIA.
The draft tackles the persistent challenges of consent fatigue and cookie banner proliferation through fundamental reforms to the ePrivacy Directive. The key mechanism is porting relevant provisions from Article 5(3) into GDPR as a new Article 88a, bringing personal data processing from terminal equipment fully within GDPR's regulatory framework, whilst preserving existing ePrivacy protections for non-personal data.
Controllers face new operational requirements: they must provide single-click refusal mechanisms and are prohibited from repeating consent requests for the same purpose within six months of refusal or during any granted consent's validity period. Two purposes gain specific exemptions from consent requirements - creating aggregated audience measurement for the controller's own use, and maintaining security. Personal data obtained under these exemptions can be processed using the full range of GDPR lawful bases, including legitimate interests, though controllers must carefully weigh factors including whether data subjects are children, reasonable expectations, and processing scale, ensuring processing doesn't constitute continuous monitoring of private life.
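The six-month suppression rule lends itself to a simple decision check. Purely as an illustrative sketch — the function name, inputs and storage model are assumptions of ours, not anything the proposal prescribes — a controller's consent-prompt logic per purpose might look like:

```python
from datetime import datetime, timedelta
from typing import Optional

# Roughly six months, per the proposed suppression period (assumed duration).
SUPPRESSION_WINDOW = timedelta(days=182)

def may_show_consent_prompt(
    last_refusal: Optional[datetime],
    consent_valid_until: Optional[datetime],
    now: datetime,
) -> bool:
    """Return True only if a fresh consent request for a given purpose is allowed.

    Under the proposal, a request for the same purpose may not be repeated
    within six months of a refusal, nor while granted consent is still valid.
    """
    if last_refusal is not None and now - last_refusal < SUPPRESSION_WINDOW:
        return False  # refused less than six months ago: do not re-prompt
    if consent_valid_until is not None and now < consent_valid_until:
        return False  # consent already in place: no need to ask again
    return True
```

On this logic, a user who refused consent 30 days ago would not be prompted again for the same purpose, whereas a refusal more than six months old would allow a new request.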
The most significant, although longer-term, innovation comes through Article 88b's automated consent mechanism. Controllers must enable data subjects to give or refuse consent through machine-readable signals, whilst web browser providers (excluding SMEs) must provide technical means for users to transmit these preferences. Implementation timelines reflect technical complexity: controllers have 24 months, browser providers 48 months - a substantial extension from earlier drafts. The Commission will task European standardisation organisations with developing necessary protocols, with conformity to harmonised standards creating a presumption of compliance. Media service providers (as defined in the European Media Freedom Act) are exempted from respecting these automated signals, acknowledging advertising revenue's role in sustaining independent journalism.
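The proposal leaves the technical signal to future harmonised standards, but an existing precedent for browser-transmitted preference signals is the Global Privacy Control, sent as the `Sec-GPC` HTTP request header. Purely as a hedged sketch of what honouring such a machine-readable signal could look like on the controller side (the header and its interpretation here follow GPC, not any format the Digital Omnibus itself defines):

```python
from typing import Mapping

def automated_refusal_signal(headers: Mapping[str, str]) -> bool:
    """Detect a GPC-style machine-readable refusal signal in request headers.

    The Global Privacy Control draft defines `Sec-GPC: 1` as an expression
    that the user refuses certain processing; any Article 88b signal would
    instead be defined by future European standardisation work.
    """
    # HTTP header names are case-insensitive, so normalise the keys first.
    normalised = {k.lower(): v.strip() for k, v in headers.items()}
    return normalised.get("sec-gpc") == "1"
```

A controller receiving such a signal would treat consent as refused without displaying a banner; under the proposal, exempted media service providers could disregard it.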
Insights: practical implications for the adtech ecosystem
These changes present both opportunities and challenges for the digital advertising industry. The automated consent mechanism could finally deliver on the promise of reducing cookie banner friction - but only if industry stakeholders actively participate in standards development. The 24-month controller implementation window is tight for complex adtech infrastructures, particularly where consent management platforms must integrate with browser signals across diverse technical environments. Ultimately, success will depend not only on technical implementation, but on how the automated choice mechanism is presented to users.
The exemptions create interesting strategic possibilities. The audience measurement exemption provides a consent-free foundation for basic analytics, potentially reducing reliance on third-party measurement vendors. However, subsequent processing provisions require careful analysis. Additional clarity on lawful bases is helpful, but enforcement challenges remain - particularly whether regulators will accept legitimate interests for targeted advertising use cases.
That all said, whilst the Digital Omnibus simplification measures offer genuine benefits across sectors, adtech practitioners should be cautious about assuming equivalent relief. The broader context of the EU’s Data Union Strategy reveals persistent regulatory concerns about data-intensive advertising practices. The Commission’s explicit focus on “data brokerage” - characterised as collecting, aggregating, and trading personal data without individuals’ awareness or meaningful consent - demonstrates continued scepticism towards certain data commercialisation models. Whilst data brokerage represents only one component of the adtech ecosystem, regulatory rhetoric often conflates these distinct practices, risking a broad-brush approach to enforcement.
This matters because simplification in regulatory frameworks does not necessarily translate to simplification in enforcement outcomes. Although the Digital Omnibus provides clarity on lawful bases and consent mechanisms, where adtech controllers face subjective assessments (for example, around legitimate interests balancing tests and what constitutes “reasonable expectations”), regulatory track record suggests these subjective elements will consistently be interpreted against any use in an adtech context (often out of principle). The Commission’s call for “strengthened enforcement” and assessment of “additional safeguards” indicates that procedural clarity may simply provide regulators with more effective tools for rigorous oversight. For adtech practitioners, harmonisation should be welcomed for reducing compliance complexity, but this may offer limited protection against the underlying regulatory hostility that has characterised EU data protection enforcement in this sector for the past decade.
The Digital Omnibus envisions a consolidated regime for re-use of public sector information, moving most provisions of the Open Data Directive and of the Data Governance Act to new chapters of the Data Act. Two sets of rules would apply, one for data and documents falling under the current Open Data Directive, and one for data that is protected under confidentiality, intellectual property or personal data protection rules (as per the current Data Governance Act). This would give businesses an EU-wide right to re-use data and documents that qualify as open data, removing fragmentation in the way Member States decide to grant access and impose licences or restrictions. The definition of available data and documents, as well as excluded categories, would remain the same. Where businesses ask to re-use data falling outside the open data categories, the other set of rules would apply. This replicates the provisions of the Data Governance Act that currently apply to public sector bodies.
The conditions for re-use would also be largely unaffected, with, for instance, a ban on discriminatory clauses and exclusivity arrangements, and a duty of transparency on pricing. There is, however, one significant change: both for open data and for protected data, very large enterprises, and in particular gatekeepers (as designated under the Digital Markets Act (DMA)), could face more stringent licence terms and higher fees, in consideration of their ‘economic power’ or their ‘ability to acquire data’, applied proportionately and subject to objective criteria, with a view to covering costs and obtaining a ‘reasonable return on investment’. This will likely attract criticism.
To simplify the rules on data intermediaries, the Digital Omnibus proposes to replace mandatory notification with a voluntary registration (this would still give data intermediaries the right to use the EU logo and call themselves “recognised in the Union”). Many of the stringent requirements, such as the need for a separate legal entity, would be deleted, and intermediaries would have greater flexibility to provide additional services. Value-added services, however, would need to be offered through a ‘functionally separate entity’. Importantly, entities designated as a gatekeeper under the DMA would not have the right to offer such value-added services. (The proposal also clarifies other important aspects of the activity of data intermediaries, with respect to pseudonymisation and anonymisation of personal data, and in relation to transfer of non-personal data outside the European Union).
These proposals will now be sent to the European Parliament and the Council (comprising representatives of the 27 EU Member States) for review, amendment and adoption. The Commission is already looking at further simplification initiatives and is undertaking a ‘stress-test’ of the digital rulebook, via a Digital Fitness Check. Stakeholders are invited to provide their views by 11 March 2026.
The overall legislative process is outlined below.
Legislative process:
Late November–December 2025: The European Commission will send its proposals for the Digital Omnibus package to the European Parliament, which will allocate the package to the relevant committee(s). The Committee on the Internal Market and Consumer Protection (IMCO), the Committee on Industry, Research and Energy (ITRE) and the Committee on Civil Liberties, Justice and Home Affairs (LIBE) are likely to play leading roles. Political groups will then nominate a rapporteur and shadow rapporteurs to draft the Parliament’s report in the relevant committees;
From January 2026: Members of the European Parliament (MEPs) will discuss and table amendments to the proposals during committee meetings, with the aim of adopting a final report by Q1 2026;
Council discussions: In parallel, representatives of the 27 EU Member States in the Council will begin technical talks to prepare their position, known as a ‘general approach’, expected in Q1 2026;
By Q2/Q3 2026: Once the responsible European Parliament committee(s) adopt their final report, that report is endorsed by a vote of all MEPs in plenary session, and the Council agrees its general approach, interinstitutional “trilogue” negotiations between the European Commission, European Parliament, and Council will begin, likely in spring 2026, to reach a compromise text on the proposal;
Possible acceleration: This timeline could accelerate if the European Parliament decides to apply its urgent procedure under Rule 170 of its Rules of Procedure, as it has done for previous omnibus packages. This would allow the proposal to bypass the full Committee stage and go directly to a vote during a Parliament plenary session, potentially enabling adoption as early as Q1 2026. This fast-track option reduces opportunities for amendments and shortens stakeholder engagement windows;
Final adoption: Under the standard process, adoption is expected by mid-2026 or Q3 2026, but this will depend on how aligned the Parliament and Council are with the Commission’s initial proposals. This timeline could take longer if the Parliament and Council have significant amendments to the original proposals, or proceed more quickly if the urgent procedure is triggered.
If you have any queries regarding the Digital Omnibus package, please get in touch with our article contributors.