
Diana Vorniceanu
10 Mar 2026 / 8 Min Read
Itxaso Domínguez de Olazábal, PhD, Policy Officer at European Digital Rights (EDRi), unpacks the Digital Omnibus and what it means for Europe's data governance framework.
The Digital Omnibus is a legislative package that reopens core elements of the EU’s digital rulebook, including the General Data Protection Regulation (GDPR), the ePrivacy Directive, and the AI Act.
It is presented as simplification. In substance, however, it recalibrates fundamental safeguards that structure how personal data and automated decision-making are regulated in Europe.
For payments and ecommerce businesses, this is not peripheral: payment systems rely heavily on identity verification, fraud analytics, behavioural modelling, credit scoring, biometric authentication, and automated transaction controls. These functions sit directly within the legal zones that the Omnibus seeks to modify.
The proposed changes to the GDPR go beyond procedural simplification. They affect the scope of the Regulation, the structure of lawful bases, safeguards around automated decision-making, protections for sensitive data, and the operation of transparency and access rights.
For companies, the central issue is legal certainty.
The GDPR currently provides a relatively stable architecture: clear definitions, a technology-neutral structure, strict conditions for derogations, and strong transparency mechanisms. This enables businesses to build compliance frameworks that are consistent across member states and across sectors.
Several elements of the Omnibus risk destabilising that foundation.
The proposal changes how identifiability is assessed and opens the possibility that certain pseudonymised datasets may fall outside the GDPR for specific actors.
At present, whether data qualify as personal data is assessed objectively and contextually. It does not depend solely on what a particular controller claims to be able to do. This ensures a uniform application of the GDPR across the EU.
However, if identical datasets could be treated as personal data by some actors and as non-personal data by others, the result would be fragmentation. Companies operating in multi-actor ecosystems, such as payments infrastructures with processors, analytics vendors, and fraud service providers, would face inconsistent qualification across the chain.
This reduces predictability at every link in the processing chain.
Legal certainty at the threshold stage matters. If the applicability of the GDPR itself becomes variable, every downstream safeguard becomes less predictable.
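The contextual nature of identifiability can be illustrated with a minimal sketch. The scheme below is hypothetical (a keyed-hash pseudonymisation of customer IDs; all names are illustrative): the actor holding the key can re-link every record, while a downstream vendor receiving only the tokens cannot. The dataset is identical in both hands, which is why the GDPR assesses identifiability objectively rather than by what one actor claims it can do.

```python
import hmac
import hashlib

# Hypothetical illustration: a controller pseudonymises customer IDs with a
# keyed hash (HMAC-SHA256). The key stays with the controller and is never
# shared with downstream vendors.
SECRET_KEY = b"controller-held-key"  # assumed for illustration only

def pseudonymise(customer_id: str, key: bytes = SECRET_KEY) -> str:
    """Deterministic keyed hash of an identifier."""
    return hmac.new(key, customer_id.encode(), hashlib.sha256).hexdigest()

# The key holder can rebuild the token-to-identity mapping at will:
known_ids = ["cust-1001", "cust-1002"]
lookup = {pseudonymise(cid): cid for cid in known_ids}

token = pseudonymise("cust-1001")
assert lookup[token] == "cust-1001"  # re-identifiable with the key

# A vendor without the key sees only opaque tokens in the very same
# dataset: one actor can identify individuals, the other cannot.
```

If the legal qualification of such a dataset depended on each actor's own capabilities, the same records would move in and out of the GDPR as they travel along the chain.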
The introduction of a dedicated provision for AI training and operation linked to legitimate interest represents a substantive policy shift.
Under the GDPR, legitimate interest requires a strict three-step test, including an assessment of necessity and balancing against the rights and expectations of individuals. This approach is designed to be contextual and case specific.
Creating a technology-specific provision risks signalling that large-scale data reuse for AI is structurally legitimate rather than exceptional. Even if the formal test remains unchanged, the interpretative environment will shift.
For companies, this creates uncertainty in both directions: in what the new basis actually permits, and in how supervisory authorities and courts will read it.
The result is increased enforcement volatility. A lawful basis that is both politically contested and structurally ambiguous does not simplify compliance. It complicates risk assessment.
The changes to Article 22 recalibrate the role of automated decision-making in the GDPR. Currently, automated decisions producing significant effects are treated as inherently high-risk and permitted only under strict conditions. The proposal weakens this structure by reframing the necessity test and contractual justification.
If necessity is assessed by reference to how a service is designed rather than by objective indispensability, automation can become legally justified simply because the business model depends on it.
For sectors such as payments, where credit scoring, fraud blocking, onboarding decisions, and pricing are automated, this may create short-term operational flexibility, but it also introduces long-term uncertainty.
When the boundary between permissible and impermissible automation becomes less clear, litigation risk increases.
The proposal introduces new pathways for retaining sensitive data in AI systems where removal is deemed disproportionate, and it extends derogations for biometric authentication under loosely defined notions of user control. Sensitive data protections are among the strictest elements of the GDPR. Weakening them in AI-heavy contexts creates ambiguity around what is genuinely permitted.
For payment providers integrating biometric authentication or behavioural inference, the risk is not simply regulatory: it is reputational and systemic. If protections depend on vague standards such as ‘disproportionate effort’ or partial notions of control, companies cannot reliably predict how supervisory authorities will interpret them.
Ambiguity at this level undermines investment planning.
The proposal expands the conditions under which data controllers may refuse, delay, or generalise responses to access and information requests.
Transparency rights are not administrative formalities. They are the activation mechanism for enforcement. If access becomes conditional or discretionary, disputes become more complex and more likely to escalate.
From a business perspective, strong and clear transparency rules reduce disputes. Weak and flexible transparency rules increase friction, because individuals are more likely to escalate when they perceive opacity.
Raising breach notification thresholds and expanding exemptions from Data Protection Impact Assessments (DPIAs) shift more discretion to controllers.
While reducing over-reporting may seem attractive, higher thresholds also reduce the amount of early regulatory feedback and weaken the signalling function of supervision.
DPIAs, when robust, provide ex ante risk clarity. Expanding exemptions may reduce the short-term workload, but it increases ex post liability if risks are underestimated.
Taken individually, some of these changes may appear technical. Taken together, they shift the GDPR away from a preventive, rights-based model towards a more discretionary, controller-driven model.
For companies, discretion is not the same as certainty. Where the law relies more heavily on internal assessments and flexible standards, enforcement becomes less predictable. Supervisory authorities intervene later, and courts play a larger corrective role.
For global payments actors operating across jurisdictions, predictability and uniform application are strategic assets. Legal regimes that weaken structural clarity may reduce immediate constraints but increase long-term compliance volatility.
In financial systems built on trust and interoperability, legal certainty is not a theoretical value. It is operational infrastructure.
The Omnibus also seeks to address what is described as consent fatigue, particularly in relation to cookies and tracking technologies. In ecommerce, cookies and similar tools serve both marketing and security purposes.
The critical question is how consent is expressed and enforced. Privacy signals represent one possible structural shift. Instead of repeated banner interactions, users express consent preferences in a machine-readable format at browser or device level.
From a business perspective, privacy signals can simplify consent handling and reduce banner friction.
They may also reduce the amount of granular behavioural tracking available for targeted advertising.
For payments providers the more important dimension is legal certainty. Consent mechanisms that rely on manipulative design patterns are increasingly subject to scrutiny. Infrastructure-level signalling reduces that exposure.
The strategic choice is between maximising data extraction through interface optimisation and building predictable compliance through standardised signalling.
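One existing instance of infrastructure-level signalling is the Global Privacy Control, expressed as the `Sec-GPC: 1` request header. The sketch below shows, under stated assumptions, how a server-side check might honour it; the function name and header-dictionary shape are illustrative, not a reference implementation.

```python
# Hypothetical sketch: honouring a browser-level privacy signal server-side.
# The Global Privacy Control (GPC) signal arrives as the "Sec-GPC: 1" header;
# this handler treats it as an objection to tracking for the request.

def tracking_allowed(headers: dict[str, str]) -> bool:
    """Return False when the request carries an opt-out privacy signal."""
    # HTTP header names are case-insensitive; normalise before lookup.
    normalised = {k.lower(): v.strip() for k, v in headers.items()}
    return normalised.get("sec-gpc") != "1"

assert tracking_allowed({"User-Agent": "example"}) is True
assert tracking_allowed({"Sec-GPC": "1"}) is False
```

The design point is that the decision is made once, from a standardised machine-readable signal, instead of being renegotiated through a banner on every visit.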
The AI Omnibus reopens the AI Act shortly after its entry into force, bypassing the planned review process. We and other civil society organisations have criticised both the procedure and the substance of this reopening because it weakens core safeguards. For payments and ecommerce actors, three aspects are particularly relevant.
Under the AI Act, providers whose systems match the criteria of high-risk categories in Annex III can declare that their system does not pose a significant risk, provided that certain conditions are met. A transparency safeguard requires such exemptions to be registered publicly.
The Omnibus proposes removing that safeguard. If providers can self-exempt from high-risk obligations without public registration, oversight becomes opaque.
This reduces market transparency and increases enforcement unpredictability for credit scoring, fraud detection, and biometric systems used in payments. Opacity benefits short-term flexibility. It undermines systemic trust.
The proposal extends lighter compliance obligations beyond traditional SMEs to larger mid-cap enterprises.
Under the AI Act, risk is meant to be tied to the use case, not to organisational size. Expanding privileges based on company scale weakens that principle.
For payments markets, this creates competitive distortion when similar systems are subject to different compliance burdens based solely on corporate size.
The Omnibus also proposes delaying the implementation of certain AI Act provisions. This does not resolve underlying obligations. It extends legal uncertainty and compresses preparation timelines later. In regulated sectors such as payments, regulatory predictability is more valuable than temporary deferral.
The Digital Fairness Act focuses on manipulative design and exploitative digital practices. Many payments processes involve behavioural nudging, personalised offers, dynamic pricing, and interface optimisation.
When the default setting of the GDPR shifts towards automated decision-making, and when AI oversight transparency is reduced, the significance of these design practices increases.
If automated scoring determines which users see which offers, and interface design shapes data disclosure, then data protection, AI governance, and fairness regulation converge.
Payments businesses should therefore treat data protection, AI governance, and fairness compliance as one integrated programme rather than as separate workstreams.
Fragmented compliance models will struggle in this environment.
The Digital Omnibus weakens certain structural safeguards within Europe’s digital rulebook. For certain stakeholders, this may appear as reduced compliance friction. However, weakening default protections in foundational law introduces volatility.
Payments infrastructure depends on trust. Trust, in turn, depends on predictable legal architecture. Executives should not approach the Omnibus as a narrow regulatory adjustment. It is a test of governance maturity.
Companies that invest in explainable systems, defensible data practices, and transparent decision logic will adapt more easily to future regulatory recalibration.
Companies that optimise around ambiguity will face compounded risk when enforcement and judicial scrutiny intensify. In financial systems, stability is a competitive advantage.

Itxaso Domínguez de Olazábal, PhD, is a Policy Officer at European Digital Rights (EDRi). She is an expert in data protection and privacy, with a focus on commercial surveillance and the multidimensional virtual harms caused by online tracking. She also specialises in online freedom of expression, examining the role of security forces in content governance and policy making.

European Digital Rights (EDRi) is the biggest European network defending rights and freedoms online.
The Paypers is a global hub for market insights, real-time news, expert interviews, and in-depth analyses and resources across payments, fintech, and the digital economy. We deliver reports, webinars, and commentary on key topics, including regulation, real-time payments, cross-border payments and ecommerce, digital identity, payment innovation and infrastructure, Open Banking, Embedded Finance, crypto, fraud and financial crime prevention, and more – all developed in collaboration with industry experts and leaders.