FRIA vs DPIA – How to Conduct AI Impact Assessments Under the EU AI Act

The introduction of the EU AI Act has added a new layer to risk assessments: the Fundamental Rights Impact Assessment (FRIA). For organisations already familiar with the Data Protection Impact Assessment (DPIA) under GDPR, the key challenge is understanding how these two frameworks interact — and how to implement them in practice.

If your organisation deploys high-risk AI systems, you will often need to conduct both a DPIA and a FRIA. While they share a similar structure, their scope and legal focus differ.

This guide explains:

  • When a FRIA is required under the EU AI Act
  • How it differs from a DPIA
  • How to conduct an AI impact assessment step by step

What Is a DPIA Under GDPR?

A Data Protection Impact Assessment (DPIA) is required under Article 35 of GDPR when data processing is likely to result in a high risk to individuals’ rights and freedoms.

Typical triggers include:

  • Large-scale processing of personal data
  • Systematic monitoring
  • Use of sensitive data
  • Automated decision-making

The purpose of a DPIA is to:

  • Identify risks to individuals
  • Assess their severity and likelihood
  • Define mitigation measures

For most organisations, DPIAs are already a standard part of compliance processes.

What Is a FRIA Under the EU AI Act?

A Fundamental Rights Impact Assessment (FRIA) is introduced under Article 27 of the EU AI Act.

It applies to specific deployers of high-risk AI systems: bodies governed by public law, private entities providing public services, and deployers of AI used for creditworthiness assessment or life and health insurance pricing. Typical areas include:

  • Employment and recruitment
  • Credit scoring and insurance
  • Education and exams
  • Public services and benefits

Unlike a DPIA, a FRIA goes beyond data protection.

👉 It assesses risks to fundamental rights, including:

  • Non-discrimination
  • Privacy
  • Freedom of expression
  • Access to services

This broader scope is what makes FRIA a central requirement for AI governance.

FRIA vs DPIA: Key Differences

Element | DPIA (GDPR Art. 35) | FRIA (AI Act Art. 27)
When required | Before high-risk personal data processing | Before deployment of a high-risk AI system
Scope of risk | Risks to data protection and privacy | Risks to fundamental rights (broader scope)
Legal basis | GDPR, Article 35 | EU AI Act, Article 27
Assessment logic | Identify, evaluate, and mitigate risks | Same structured, risk-based approach
Documentation | Mandatory | Mandatory
Review cycle | Ongoing review and updates | Ongoing monitoring and updates
Regulatory access | Provided upon request | Provided upon request

When Do You Need a FRIA?

A FRIA is required when:

  • You are a deployer of a high-risk AI system listed in Annex III
  • You are a public body or a private entity providing public services, or the system is used for creditworthiness assessment or insurance risk pricing
  • The system may affect the fundamental rights of individuals

Common examples include:

  • AI used in hiring decisions
  • Creditworthiness assessments
  • Automated student evaluation
  • AI-assisted healthcare decisions

👉 In practice: if your system requires a DPIA, it will very often also require a FRIA.

How to Conduct a FRIA (Step by Step)

Step 1 — Define the Use Case

Clearly describe what the AI system does, where it is used, and who is affected. This sets the scope of the assessment.

Step 2 — Identify Affected Individuals

Define the categories of individuals affected, the scale of impact, and whether any vulnerable groups are involved.

Step 3 — Map Fundamental Rights Risks

Assess risks such as discrimination, lack of transparency, unfair outcomes, or exclusion from services.

Step 4 — Evaluate Severity and Likelihood

Determine how serious each risk is and how likely it is to occur, using a structured evaluation approach.

Step 5 — Define Mitigation Measures

Implement safeguards such as human oversight, bias testing, transparency measures, and appeal mechanisms.

Step 6 — Document and Approve

Document the risks, mitigation measures, and decision-making process. Ensure internal approval and accountability.

Step 7 — Ongoing Monitoring

Continuously monitor system performance, track incidents, and update the FRIA as the system evolves.
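The steps above can be captured in a simple risk register. The sketch below is illustrative only: the AI Act does not prescribe a scoring method, and the class names, 1–3 scales, and priority thresholds here are assumptions, not a regulatory template.

```python
from dataclasses import dataclass, field

# Assumed 1-3 scales; Article 27 does not mandate any particular scoring scheme.
SEVERITY = {"low": 1, "medium": 2, "high": 3}
LIKELIHOOD = {"unlikely": 1, "possible": 2, "likely": 3}

@dataclass
class RightsRisk:
    """One entry in a FRIA risk register (Steps 3-5)."""
    right: str              # affected fundamental right, e.g. "non-discrimination"
    description: str        # how the AI system could cause this harm
    severity: str           # "low" | "medium" | "high"
    likelihood: str         # "unlikely" | "possible" | "likely"
    mitigations: list = field(default_factory=list)

    @property
    def score(self) -> int:
        # Step 4: combine severity and likelihood into a single score
        return SEVERITY[self.severity] * LIKELIHOOD[self.likelihood]

    @property
    def priority(self) -> str:
        # Hypothetical thresholds for triaging mitigation work
        if self.score >= 6:
            return "high"
        return "medium" if self.score >= 3 else "low"

# Example entry for an AI-assisted hiring tool
risk = RightsRisk(
    right="non-discrimination",
    description="Model may systematically rank candidates from certain groups lower",
    severity="high",
    likelihood="possible",
    mitigations=["bias testing before deployment", "human review of rejections"],
)
print(risk.score, risk.priority)  # 6 high
```

A register of such entries, exported with the approval decision (Step 6) and revisited during monitoring (Step 7), gives you the documented, repeatable record that both frameworks expect.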
