
Insights · 10 min read

AI Act vs GDPR: What Compliance Officers Need to Know

If you already run a GDPR programme, the AI Act is not a rewrite — but it's not a free pass either. A side-by-side for DPOs who need to know exactly where their existing programme ends.

By Mehmet Köse · Founder, Synapsrix

If you already run a mature GDPR programme, you have reusable governance muscle: records of processing, DPIAs, vendor reviews, and incident response. The AI Act does not replace that programme — but it is not “GDPR with different letterhead.” It is a product and lifecycle regime for AI systems, with obligations that attach even when no personal data is processed.

This article is written for DPOs, privacy counsel, and compliance officers who need a precise map of overlap, tension, and net-new work, with citations to GDPR Articles 30, 35, and 22 and AI Act Articles 9, 14, 26, 27, 50, 72, and 73. Where EDPB guidance will still move the line on edge cases, we say so plainly.

Two regulations, one compliance programme?

You can govern AI inside your existing privacy and information-security forums — one steering group, one inventory — but you must not collapse the legal tests. GDPR asks whether processing is lawful and fair. The AI Act asks whether an AI system is high-risk, limited-risk, or prohibited, then applies Chapter III requirements that mirror product safety thinking: risk management (Article 9), data governance (Article 10), technical documentation (Article 11 / Annex IV), logging and record-keeping (Article 12), transparency to deployers (Article 13), human oversight (Article 14), and accuracy/robustness (Article 15).

If you “bolt on” the AI Act only as a DPIA annex, you will miss non-personal-data risk and provider duties that never appeared in GDPR records.

Keyword reality: teams searching “AI Act GDPR” or “AI Act compliance officer” usually need a role description update more than a new policy PDF. The compliance officer often owns vendor risk; the DPO owns data rights. High-risk AI forces joint sign-off on go-live when Article 27 FRIAs and Article 35 DPIAs must align.

Where they overlap

High-risk AI systems that process personal data trigger both frameworks. Practical overlap points:

  • Lawfulness and purpose limitation (GDPR Article 5) intersect with data governance for training and operation (Article 10) when personal data is in play. If your lawful basis is legitimate interests, document how Article 9 risk controls interact with your balancing test.

  • Data minimisation (Article 5(1)(c)) aligns with training-data choices under Article 10 — but you still need standalone reasoning where no GDPR processing occurs.

  • Transparency: GDPR Articles 13–14 privacy notices align with Article 13 AI Act instructions for deployers and Article 50 transparency for certain AI interactions.

  • DPIA (GDPR Article 35) and FRIA (Article 27) can be combined where conditions match — the AI Act explicitly contemplates alignment for deployers subject to Article 27.

  • Human review of automated decisions: GDPR Article 22 rights and Article 26 deployer oversight reinforce each other where decisions produce legal or similarly significant effects. Layer Article 86 explanation duties where applicable.

  • Security: GDPR Article 32 measures support Article 15 robustness and cybersecurity for high-risk systems — but they are not identical tests.

  • Processors and providers: GDPR Article 28 should be read with Article 25 value-chain rules — your vendor may owe conformity artefacts you must verify in procurement.

Where the AI Act goes further than GDPR

GDPR does not mandate Annex IV-style technical documentation for every model. The AI Act does for providers of high-risk systems. Similarly:

  • Risk management system (Article 9) is continuous and system-specific; it is not reducible to a one-time DPIA.

  • Post-market monitoring for providers (Article 72) and serious incident reporting (Article 73) create AI-specific timelines distinct from GDPR breach notification (Articles 33–34). A personal-data breach may be neither necessary nor sufficient for an Article 73 serious incident — run both playbooks where facts overlap.

  • Conformity assessment and CE marking paths (Articles 40–48) have no GDPR analogue for software products.

  • Registration in the EU database (Article 49 / Article 71) is an AI Act artefact, not a GDPR register entry.

  • Prohibited practices (Article 5) are not covered by GDPR legality analysis — you cannot “lawfully” process personal data through a banned system.

  • Fundamental rights impact assessment (Article 27) targets AI-specific harms beyond privacy — discrimination risks, labour rights, access to services — even when your DPIA already addressed data risks.

A Venn-style mental model

Think of GDPR as protecting people through data rules. Think of the AI Act as protecting people and society through system rules. The overlap is largest when high-risk systems process personal data for Annex III purposes — for example, recruitment or credit. The overlap is smallest when critical infrastructure AI processes telemetry without identifiable persons — GDPR coverage may be thin, but the AI Act still bites.

Where GDPR still covers things the AI Act does not

GDPR’s storage limitation, subject access, erasure, and DPO appointment rules apply to personal data whether or not the AI Act cares about your model. Conversely, an AI system can be high-risk while processing no personal data — for example, certain critical infrastructure analytics — where GDPR is not your lead regime.

The AI Act does not appoint an “AI DPO” for private companies. Roles are implied through obligations, not a mandated title.

Controllers, processors, providers — keep the map legible

GDPR cares about controller vs processor. The AI Act cares about provider, deployer, importer, distributor. A cloud company selling HR AI may be processor under GDPR to your organisation and provider under the AI Act for the same system. Your DPA must not silently assign conformity duties you cannot perform.

When you fine-tune a model on your data, you may shift toward provider-like responsibilities for that modified system — legal review matters before you let a department “just tweak” weights on employee data.

Data subject rights vs AI Act transparency

Data subject requests under GDPR Articles 15–20 may force disclosure of personal data held in logs and outputs. AI Act Article 26(11) requires deployers to inform people that they are subject to the use of a high-risk system; Article 86 addresses explanations of individual decision-making for certain outputs. Operationally, route privacy tickets and AI governance tickets together when a complaint references both discrimination concerns and data access.

International transfers vs model hosting

Chapter V GDPR still governs personal data leaving the EEA. The AI Act does not replace SCCs or TIA work for training pipelines. If your provider hosts inference in third countries, you still need transfer analysis and you still need their Article 13 instructions for correct deployment.

The DPIA ↔ FRIA relationship

Article 27 requires a Fundamental Rights Impact Assessment from listed deployers before they deploy certain high-risk systems. Article 27(4) lets you lean on an existing Article 35 DPIA where it already covers the relevant obligations, with the FRIA complementing it rather than duplicating it.

Operationally, maintain one dossier with a cover page that states: which GDPR Article 35 triggers fired, which Article 27 bullets you address, and where AI Act Article 13 instructions were incorporated. Avoid two binders that contradict each other six months later.
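As a minimal sketch, here is what that cover page could look like as structured metadata kept next to the dossier. Every field name below is illustrative, not drawn from any official template.

```python
from dataclasses import dataclass, field

@dataclass
class DossierCoverPage:
    """Illustrative cover page for a combined DPIA/FRIA dossier; field names are assumptions."""
    system_name: str
    dpia_triggers: list = field(default_factory=list)    # which GDPR Article 35 triggers fired
    fria_elements: list = field(default_factory=list)    # which Article 27 elements are addressed
    provider_instructions_ref: str = ""                  # where Article 13 instructions for use are incorporated
    last_joint_review: str = ""                          # date of the last DPO / compliance sign-off

cover = DossierCoverPage(
    system_name="CV screening assistant",
    dpia_triggers=["systematic evaluation / profiling", "data on vulnerable subjects"],
    fria_elements=["categories of affected persons", "risks of harm", "human oversight measures"],
    provider_instructions_ref="vendor instructions for use v2.1, section 4",
    last_joint_review="2025-01-15",
)
print(cover)
```

Whether this lives in a GRC tool or a plain spreadsheet matters less than keeping it versioned with the dossier it fronts.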

EDPB–Commission interplay will refine how much overlap satisfies both; also track the Article 27(3) duty to notify market surveillance authorities of FRIA results and, where relevant, the powers of fundamental rights authorities (including equality bodies) under Article 77.

What your existing GDPR documentation gets you

| Asset you likely already have | How far it gets you on the AI Act |
| --- | --- |
| Records of processing (GDPR Article 30) | Roughly half of an AI inventory for systems touching personal data — not enough for non-personal high-risk uses |
| DPIAs (Article 35) | Strong on rights impacts; must extend to risk management evidence for high-risk AI where Article 9 applies |
| Vendor DPAs & SCCs | Useful for data flows, not for conformity artefacts unless you attach AI schedules |
| SOC 2 / ISO 27001 evidence | Helps Article 15 security and parts of logging controls — not a substitute for Annex IV |
| Incident response | A starting point for Article 73 serious incidents, but timelines and triggers differ from GDPR 72-hour breach rules |

A 10-dimension comparison

| Dimension | GDPR anchor | AI Act anchor | Notes |
| --- | --- | --- | --- |
| Scope trigger | Personal data | AI system risk class | AI Act can apply without personal data |
| Core risk tool | DPIA (Art 35) | Risk management (Art 9) + FRIA (Art 27) for deployers | Combine where possible |
| Transparency | Arts 13–14 | Arts 13, 50, deployer info duties | Chatbots: Article 50 adds UX duties |
| Automated decisions | Art 22 | Human oversight Arts 14 & 26 | Different tests; align UX |
| Documentation | RoPA Art 30 | Annex IV (providers) | Distinct structure |
| Security | Art 32 | Art 15 | Overlap but not identical |
| Incident reporting | Arts 33–34 | Art 73 (+ Art 72 monitoring context) | Different clocks and triggers |
| Accountability | Controller obligations | Provider/deployer duties | Map roles along Article 25 chain |
| Rights of individuals | Access/erasure/etc. | Art 26(11) information; Art 86 explanation | Art 86 sits in Chapter IX |
| International transfers | Chapter V | Not a GDPR replacement | Transfer tools still matter for data |

This table is a working model, not a court brief — EDPB guidance may adjust how strictly independence requirements apply when DPIAs and FRIAs are merged.

What you still need to build

Net-new elements for many GDPR-mature SMEs:

  1. Classification standard for Annex III vs Article 50 vs minimal risk (a sketch of such a record follows this list).
  2. Model and integration inventory including non-personal systems in critical contexts.
  3. Deployer evidence for Article 26 (oversight assignment, monitoring, logs).
  4. Provider dependency map for Article 72/73 touchpoints when incidents chain through vendors.
  5. Change-management hooks when foundation models update monthly.
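To make item 1 concrete, a classification standard can be as simple as a structured record that forces the same questions every time a system is onboarded. The Python sketch below is illustrative only; the field names and enum labels are assumptions, not the wording of Article 6 or Annex III.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class RiskClass(Enum):
    PROHIBITED = "prohibited practice (Article 5)"
    HIGH_RISK = "high-risk (Article 6 / Annex III)"
    TRANSPARENCY = "transparency duties (Article 50)"
    MINIMAL = "minimal risk"

@dataclass
class ClassificationRecord:
    """Illustrative classification memo entry; not a legal template."""
    ai_system_id: str
    risk_class: RiskClass
    annex_iii_heading: Optional[str]   # e.g. "employment", or None if no Annex III area applies
    processes_personal_data: bool      # decides whether GDPR workstreams fire alongside the AI Act
    rationale: str                     # short memo text a reviewer can challenge
    signed_off_by: str                 # joint DPO / compliance officer sign-off

record = ClassificationRecord(
    ai_system_id="AI-0007",
    risk_class=RiskClass.HIGH_RISK,
    annex_iii_heading="employment",
    processes_personal_data=True,
    rationale="Ranks job applicants, so it falls under the recruitment use case in Annex III.",
    signed_off_by="DPO and compliance officer",
)
print(record.risk_class.value)
```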

What GDPR gives you “for free” (and what it does not)

Your RoPA fields — business purpose, data categories, recipients — accelerate Article 26(4) input-data review if you already mapped which HR fields feed which model. If your RoPA is a generic “Microsoft 365” entry, you still need granularity per AI feature.
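As a minimal illustration of that granularity, the mapping can start as a dictionary per AI feature. The feature names, RoPA IDs, and field names below are invented for the example.

```python
# Hypothetical mapping from AI feature to the RoPA entry and the exact input fields it
# consumes; granular enough to support an Article 26(4) relevance check on input data.
input_data_map = {
    "cv_summary_assistant": {
        "ropa_entry": "ROPA-042 (recruitment)",
        "input_fields": ["cv_text", "years_of_experience", "education_level"],
    },
    "meeting_notes_assistant": {
        "ropa_entry": "ROPA-108 (internal communications)",
        "input_fields": ["meeting_transcript", "participant_names"],
    },
}

for feature, entry in input_data_map.items():
    print(f"{feature}: {entry['ropa_entry']} <- {', '.join(entry['input_fields'])}")
```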

Your DPIA library gives policy language on necessity and proportionality. It rarely contains conformity artefacts for Annex IV unless you already bought AI-specific annexes — update rather than reuse blindly.

Scope out your gap

Run the scanner, then walk through the deployer obligations in the Article 26 checklist. If your organisation is heavy on HR analytics or credit decisions, schedule legal review early — the overlap table above is not a substitute for fact-specific classification.

EDPB watchlist

Expect future EDPB opinions on FRIA/DPIA convergence and on Article 22 interactions with Article 26(11) transparency. Subscribe to updates; do not hard-code today’s interpretation into irreversible procurement clauses.

Working with supervisory authorities

Under the GDPR, you are used to the lead supervisory authority concept. Under the AI Act, market surveillance authorities and national competent authorities play parallel roles. For Dutch organisations, expect AP involvement where AI supervision touches personal data — but do not assume every AI Act question is a GDPR question. Your regulatory engagement strategy should name which regime a request implicates before you answer informally.

Keep evidence discipline: a GDPR audit asks for RoPA and retention tables. An AI Act inspection may ask for classification memos, instructions for use, logs, and post-market tickets. If you respond to the first with privacy artefacts only, you will look organised but non-compliant under the second regime.

Practical programme tips

  • Single inventory: store GDPR processing ID, AI system ID, and vendor ID as foreign keys — spreadsheets that drift apart become a liability in Article 26 audits (a sketch follows this list).
  • Training: teach privacy champions the Annex III headings; teach security the Article 73 incident definition.
  • Board packs: separate privacy KPIs from AI risk KPIs while showing join points on high-risk HR systems.
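To make the single-inventory tip concrete, here is a minimal sketch using an in-memory SQLite database; the table and column names are illustrative, not a prescribed schema.

```python
import sqlite3

# Illustrative schema: one row per AI system, linked to the GDPR processing record and
# the vendor it depends on, so the three registers cannot silently drift apart.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
CREATE TABLE gdpr_processing (processing_id TEXT PRIMARY KEY, purpose TEXT);
CREATE TABLE vendor          (vendor_id     TEXT PRIMARY KEY, name    TEXT);
CREATE TABLE ai_system (
    ai_system_id  TEXT PRIMARY KEY,
    risk_class    TEXT,                                            -- e.g. 'high-risk', 'transparency', 'minimal'
    processing_id TEXT REFERENCES gdpr_processing(processing_id),  -- NULL where no personal data is processed
    vendor_id     TEXT REFERENCES vendor(vendor_id)
);
""")
conn.execute("INSERT INTO gdpr_processing VALUES ('ROPA-042', 'recruitment screening')")
conn.execute("INSERT INTO vendor VALUES ('VEND-7', 'HR AI provider')")
conn.execute("INSERT INTO ai_system VALUES ('AI-0007', 'high-risk', 'ROPA-042', 'VEND-7')")

# One query answers the audit question: which systems touch which processing records and vendors?
for row in conn.execute("""
    SELECT a.ai_system_id, a.risk_class, g.purpose, v.name
    FROM ai_system a
    LEFT JOIN gdpr_processing g ON a.processing_id = g.processing_id
    LEFT JOIN vendor v ON a.vendor_id = v.vendor_id
"""):
    print(row)
```

The point is the join, not the tooling: the same structure works inside a GRC platform, as long as the three identifiers stay linked.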