The European Union (EU) has long been a global leader in establishing robust data privacy laws, creating what many refer to as the “Brussels Effect,” a phenomenon in which EU regulations influence global standards. The GDPR, for instance, has inspired similar legislation in over 120 countries, demonstrating the EU’s far-reaching impact on international data privacy norms.
With the GDPR setting a high bar for data protection in 2018, the EU continues to shape the future of privacy governance, particularly in the face of burgeoning artificial intelligence (AI) technologies.
This article explores how the GDPR and recent EU laws like the AI Act and Digital Operational Resilience Act (DORA) are advancing the need for comprehensive data governance and privacy, what’s next for AI and data processing, and how to incorporate these developments into your 2025 privacy roadmap.
GDPR and the AI Act: Raising the stakes for data privacy
Since its enforcement in 2018, the GDPR has been the gold standard for data privacy. Its principles of transparency, accountability, and individual rights have set a benchmark for privacy laws worldwide. However, the rapid evolution of AI technologies prompted the EU to adopt the AI Act, which entered into force in August 2024. The act aims to regulate AI systems according to the risk they pose to individuals’ fundamental rights, health, and safety.
The AI Act employs a tiered, risk-based approach, outright prohibiting unacceptable-risk practices such as social scoring and real-time biometric identification in public spaces. For high-risk AI systems, the act mandates:
- Risk management systems
- Transparency measures
- Data governance practices
- Human oversight mechanisms
Organizations deploying AI must align these requirements with GDPR obligations, creating a dual compliance framework that demands robust data protection measures and clear documentation of AI system processes.
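To make this dual framework concrete, below is a minimal sketch in Python of a single documentation record that tracks an AI system against both sets of obligations. The `AISystemRecord` structure, its field names, and the `gaps()` check are hypothetical illustrations, not an official template from the AI Act or GDPR.

```python
# Minimal sketch of a dual GDPR / AI Act documentation record.
# All field names and the gaps() heuristic are hypothetical.
from dataclasses import dataclass
from enum import Enum


class RiskTier(Enum):
    """AI Act tiers: unacceptable-risk practices are prohibited outright."""
    UNACCEPTABLE = "unacceptable"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"


@dataclass
class AISystemRecord:
    name: str
    purpose: str
    risk_tier: RiskTier
    # GDPR side of the dual framework
    lawful_basis: str                      # e.g. "consent", "legitimate interest"
    dpia_completed: bool = False
    # AI Act side (high-risk obligations from the list above)
    risk_management_doc: str | None = None
    transparency_notice: str | None = None
    data_governance_doc: str | None = None
    human_oversight_owner: str | None = None

    def gaps(self) -> list[str]:
        """List the documentation items still missing before deployment."""
        if self.risk_tier is RiskTier.UNACCEPTABLE:
            return ["prohibited practice: do not deploy"]
        missing = []
        if not self.dpia_completed:
            missing.append("DPIA not completed")
        if self.risk_tier is RiskTier.HIGH:
            checks = {
                "risk management documentation": self.risk_management_doc,
                "transparency notice": self.transparency_notice,
                "data governance documentation": self.data_governance_doc,
                "human oversight owner": self.human_oversight_owner,
            }
            missing += [f"missing {label}" for label, value in checks.items() if not value]
        return missing


record = AISystemRecord(
    name="credit-scoring model",
    purpose="assess loan applications",
    risk_tier=RiskTier.HIGH,
    lawful_basis="legitimate interest",
    dpia_completed=True,
    human_oversight_owner="model risk committee",
)
print(record.gaps())  # e.g. ['missing risk management documentation', ...]
```

Keeping the GDPR and AI Act fields in one record makes it easier to spot systems that satisfy one regime but not the other.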
AI governance: What’s next?
The AI Act introduces timelines for phased compliance, with most provisions taking effect by August 2026. Notable upcoming requirements include:
- AI literacy initiatives to ensure users and developers understand AI risks and benefits.
- Codes of Practice for General Purpose AI (GPAI) to be finalized by May 2025.
- Governance structures for systemic-risk AI models, emphasizing testing, risk assessments, and adversarial evaluations.
Additionally, the EU is exploring supplemental rules to harmonize procedural aspects of the GDPR, potentially improving cross-border enforcement and cooperation among data protection authorities (DPAs).
New frontiers in data governance: The EU Data Act, DORA, and NIS2
The EU Data Act
Effective September 12, 2025, the EU Data Act introduces new rules for data access, sharing, and portability, particularly for connected devices and the Internet of Things (IoT). Unlike the GDPR, which focuses on personal data, the Data Act encompasses both personal and non-personal data, fostering innovation while addressing business-to-business (B2B) and business-to-government (B2G) data sharing.
Key obligations under the Data Act include:
- Providing users access to their generated data: This includes both personal and non-personal data, as well as metadata produced by connected devices, ensuring individuals can retrieve and manage their data (a minimal export sketch follows this list).
- Ensuring data portability between service providers: Companies must facilitate seamless data transfers, enabling users to switch providers without data loss or excessive delays.
- Establishing safeguards for intellectual property and trade secrets: Organizations are required to implement protections that balance data accessibility with the need to secure proprietary information and sensitive business details.
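As one way to picture the access and portability obligations, here is a minimal sketch assuming a hypothetical connected-device data store. The JSON bundle layout and the `export_device_data` function are illustrative choices, not a format mandated by the Data Act.

```python
# Minimal sketch: bundle a device's personal data, non-personal readings, and
# metadata into a machine-readable export the user (or a new provider) can ingest.
import json
from datetime import datetime, timezone


def export_device_data(device_id: str, readings: list[dict], owner_email: str) -> str:
    """Return a portable JSON bundle of everything the device generated for this user."""
    bundle = {
        "schema_version": "1.0",                     # helps a receiving provider parse the file
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "device_id": device_id,
        "personal_data": {"owner_email": owner_email},
        "non_personal_data": readings,               # e.g. raw sensor values
        "metadata": {"record_count": len(readings)},
    }
    return json.dumps(bundle, indent=2)


if __name__ == "__main__":
    sample = [{"timestamp": "2025-01-01T00:00:00Z", "temperature_c": 21.4}]
    print(export_device_data("thermostat-042", sample, "user@example.com"))
```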
The Digital Operational Resilience Act (DORA) and NIS2 Directive
Effective January 17, 2025, DORA targets the financial sector by creating a comprehensive information and communication technology (ICT) risk management framework. Alongside DORA, the NIS2 Directive introduces stringent cybersecurity requirements for essential entities across sectors like energy, healthcare, and transport, significantly broadening the EU’s cybersecurity landscape. Together, these frameworks emphasize:
- Incident reporting within 24 hours of identification.
- Regular resilience testing to assess readiness.
- Stringent third-party risk management.
Failure to comply with DORA or the NIS2 Directive can result in substantial penalties. Under NIS2, for example, essential entities face fines of up to 10 million euros or 2% of annual global turnover, whichever is higher, while DORA leaves administrative sanctions largely to national competent authorities, which can scale penalties to the gravity of a breach. Combined with tight incident-reporting windows, this financial exposure underscores the need for robust operational resilience frameworks.
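As a simple illustration of working backwards from the 24-hour initial-notification window noted above, the sketch below computes the reporting deadline from the time an incident is identified. The fixed 24-hour constant and the function names are assumptions for illustration; actual deadlines, thresholds, and report contents depend on which regime applies and the relevant technical standards.

```python
# Minimal sketch: track the initial-notification deadline for a detected incident.
# The 24-hour window is an assumption for illustration, not legal advice.
from datetime import datetime, timedelta, timezone

REPORTING_WINDOW = timedelta(hours=24)


def reporting_deadline(identified_at: datetime) -> datetime:
    """Latest time the initial notification should be submitted."""
    return identified_at + REPORTING_WINDOW


def hours_remaining(identified_at: datetime, now: datetime | None = None) -> float:
    """Hours left before the notification is overdue (negative if already missed)."""
    now = now or datetime.now(timezone.utc)
    return (reporting_deadline(identified_at) - now).total_seconds() / 3600


if __name__ == "__main__":
    identified = datetime(2025, 1, 17, 9, 0, tzinfo=timezone.utc)
    print("Report due by:", reporting_deadline(identified).isoformat())
    print("Hours remaining:", round(hours_remaining(identified), 1))
```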
Insights from recent papers and opinions
The Hamburg Commissioner’s paper on Large Language Models and Personal Data
This paper highlights a crucial distinction: while large language models (LLMs) process personal data during training, storing such models does not necessarily constitute ongoing data processing under GDPR. This interpretation underscores the need for organizations to demonstrate accountability in training and deploying AI systems.
EDPB Opinion 28/2024 on Processing Personal Data in the Context of AI Models
The European Data Protection Board (EDPB) emphasizes rigorous evaluation of AI systems trained on personal data. To demonstrate compliance, organizations must document every step, including Data Protection Impact Assessments (DPIAs).
CIPL: The Limitations of Consent as a Legal Basis for Data Processing in the Digital Society
The evolving digital landscape challenges the scalability of consent as a lawful basis for data processing. Recent discussions from the Center for Information Policy Leadership (CIPL) suggest that legitimate interest, with safeguards like opt-outs, may offer a more practical alternative for training AI models.
Watch as privacy experts discuss these papers in Data Privacy in the EU: What You Need to Know.
Building your data privacy 2025 roadmap
To remain compliant and competitive, privacy and compliance professionals must proactively adapt to the EU’s evolving legal landscape. Here are critical steps to include in your 2025 roadmap:
1. Enhance data mapping and scoping
While data mapping has been a cornerstone of GDPR compliance, organizations must expand their efforts to include metadata and information generated by AI and connected devices. Identify high-risk AI applications and map their data flows to ensure compliance with GDPR and the AI Act.
Revisit your data inventories to include non-personal data covered under the Data Act. The Data Act’s requirements for data portability and access add layers of complexity to traditional data governance.
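One way to extend an existing inventory is sketched below; the `InventoryEntry` fields and the `regimes_in_scope()` heuristic are hypothetical illustrations, not a legal scoping test.

```python
# Minimal sketch of an expanded data inventory entry that records personal data,
# non-personal device data, metadata, AI consumers, and downstream recipients.
from dataclasses import dataclass, field


@dataclass
class InventoryEntry:
    asset: str                                       # e.g. "connected thermostat telemetry"
    contains_personal_data: bool
    contains_non_personal_data: bool
    metadata_captured: list[str] = field(default_factory=list)
    consumed_by_ai_systems: list[str] = field(default_factory=list)
    recipients: list[str] = field(default_factory=list)   # downstream data flows

    def regimes_in_scope(self) -> list[str]:
        """Rough indication of which EU regimes may apply to this asset."""
        regimes = []
        if self.contains_personal_data:
            regimes.append("GDPR")
        if self.contains_non_personal_data or self.metadata_captured:
            regimes.append("EU Data Act")
        if self.consumed_by_ai_systems:
            regimes.append("AI Act")
        return regimes


entry = InventoryEntry(
    asset="connected thermostat telemetry",
    contains_personal_data=True,
    contains_non_personal_data=True,
    metadata_captured=["device_id", "firmware_version"],
    consumed_by_ai_systems=["energy-usage prediction model"],
    recipients=["analytics vendor"],
)
print(entry.regimes_in_scope())  # ['GDPR', 'EU Data Act', 'AI Act']
```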
2. Strengthen AI governance
Develop and implement policies for AI risk management, transparency, and accountability. Include provisions for human oversight and ethical considerations in AI deployment.
3. Update policies and contracts
Review and update your privacy policies, data-sharing agreements, and third-party contracts to reflect new obligations under the Data Act and DORA.
4. Invest in training
Train your teams on AI literacy and emerging regulatory requirements. Ensure all employees understand their roles in maintaining compliance and mitigating risks.
5. Prepare for regulatory changes
Monitor updates from EU institutions, such as the European Data Protection Board (EDPB), the EU Commission, and individual DPAs. Stay informed about new procedural rules for GDPR enforcement and guidance on AI compliance.
The “Brussels Effect”: A call to action
The EU’s legislative agenda underscores its commitment to safeguarding individual rights while fostering innovation in a digital age. For businesses operating in or engaging with the EU, this means embracing a proactive, governance-driven approach to privacy and AI compliance.
Incorporating the GDPR, AI Act, Data Act, and DORA into your 2025 strategy will help you navigate the complexities of European data privacy laws. This proactive approach supports compliance and builds a resilient, future-ready organization.
The EU’s regulatory framework may seem like uncharted space, but with the right tools and mindset, you can boldly go where no compliance program has gone before.