GDPR Article 35 is one of the most misunderstood yet critically important provisions of the General Data Protection Regulation. While many organizations focus their compliance efforts on consent, privacy policies, or breach notifications, Article 35 operates at a deeper and more strategic level. It requires organizations to pause, assess risks, and think ahead before processing personal data in ways that could significantly affect individuals.
Article 35 introduces the concept of a Data Protection Impact Assessment (DPIA). A DPIA is not merely a document or a bureaucratic checkbox. It is a structured risk-assessment process designed to ensure that privacy and data protection are built into systems, products, and business models from the very beginning.
Organizations that skip this assessment expose themselves not only to regulatory fines but also to reputational damage, loss of trust, and operational risk.
What GDPR Article 35 Actually Says (Plain English)
At its core, Article 35 requires organizations to carry out a Data Protection Impact Assessment when a type of processing is likely to result in a high risk to the rights and freedoms of natural persons.
This obligation applies before the processing begins, not after something goes wrong.
In simpler terms, if you are planning to:
- process sensitive data,
- use new or intrusive technologies,
- monitor people systematically,
- or process large volumes of personal data,
you must first analyze:
- What data you collect
- Why you collect it
- What risks it creates
- How you reduce those risks
When Is a DPIA Mandatory?
Article 35 does not require a DPIA for every data-processing activity. Instead, it focuses on situations where risk is elevated.
A DPIA is required when processing is likely to result in a high risk to individuals’ rights and freedoms, particularly when using new technologies or innovative data practices.
Common Triggers for a Mandatory DPIA
A DPIA is usually required if your processing involves one or more of the following:
Systematic and extensive profiling
This includes automated decision-making that evaluates personal aspects such as behavior, health, financial status, or location.
Large-scale processing of sensitive data
Examples include health data, biometric identifiers, genetic data, political opinions, or sexual orientation.
Systematic monitoring of publicly accessible areas
For example, large CCTV deployments, facial recognition in public spaces, or smart surveillance systems.
Innovative or intrusive technologies
AI systems, machine learning models, behavioral tracking, or combining datasets in novel ways.
Data processing that affects vulnerable individuals
Children, elderly persons, patients, employees, or economically dependent individuals.
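The triggers above can be captured in a simple pre-screening helper. This is an illustrative sketch — the flag names are our own shorthand, not GDPR terminology, and a positive result only means a DPIA should be seriously considered; the final call rests with the controller.

```python
from dataclasses import dataclass

# Illustrative pre-screening flags mirroring the common DPIA triggers
# listed above. These names are our own, not official GDPR terms.
@dataclass
class ProcessingActivity:
    systematic_profiling: bool = False
    large_scale_sensitive_data: bool = False
    public_area_monitoring: bool = False
    innovative_technology: bool = False
    vulnerable_subjects: bool = False

def dpia_likely_required(activity: ProcessingActivity) -> bool:
    """Return True if any common high-risk trigger applies."""
    return any(vars(activity).values())

# Example: an AI screening tool that profiles applicants
tool = ProcessingActivity(systematic_profiling=True,
                          innovative_technology=True)
print(dpia_likely_required(tool))  # True
```

A single trigger is usually enough to warrant a DPIA; several triggers together make one effectively unavoidable.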
The Concept of “High Risk” Under Article 35
“High risk” does not mean hypothetical or speculative danger. It refers to realistic risks to fundamental rights and freedoms, such as:
- Loss of privacy
- Identity theft
- Discrimination
- Financial harm
- Reputational damage
- Loss of control over personal data
- Psychological or social harm
The more intrusive, opaque, irreversible, or large-scale the processing is, the more likely it qualifies as high risk.
Who Is Responsible for Conducting the DPIA?
The data controller is responsible for ensuring that a DPIA is carried out. This responsibility cannot be delegated entirely to a processor or third party.
However, Article 35(2) requires the controller to seek the advice of the Data Protection Officer (DPO), where one is designated. In practice, a sound DPIA also involves:
- IT and security teams
- Legal and compliance staff
- Business stakeholders responsible for the processing
If processing is carried out jointly with other controllers, responsibilities for the DPIA should be clearly allocated and documented.
What Must a DPIA Contain? (Article 35(7) Breakdown)
Article 35(7) sets out the minimum content of a DPIA. Each DPIA must include the following elements.
1. Description of the Processing Operations
This section explains what is happening in plain and technical terms.
It should cover:
- Categories of personal data
- Categories of data subjects
- Sources of data
- Data flows and storage
- Who has access to the data
- Retention periods
- Transfers to third countries
The goal is clarity, not legal complexity.
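A lightweight way to keep this description consistent across projects is a structured record. The sketch below is an illustrative template whose field names simply mirror the list above — it is not a regulator-prescribed format.

```python
from dataclasses import dataclass, field

# Illustrative DPIA description record; field names mirror the
# Article 35(7)(a) description items listed above.
@dataclass
class ProcessingDescription:
    data_categories: list[str] = field(default_factory=list)
    data_subject_categories: list[str] = field(default_factory=list)
    data_sources: list[str] = field(default_factory=list)
    storage_and_flows: str = ""
    access_roles: list[str] = field(default_factory=list)
    retention_period: str = ""
    third_country_transfers: list[str] = field(default_factory=list)

# Example: a health-monitoring app (see Example 2 below)
desc = ProcessingDescription(
    data_categories=["heart rate", "sleep patterns"],
    data_subject_categories=["app users"],
    retention_period="12 months",
)
print(desc.retention_period)  # 12 months
```

Keeping the description in a fixed structure makes gaps (an empty retention period, an unlisted transfer) immediately visible during review.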
2. Purpose and Necessity Assessment
Here, the organization must justify:
- Why the processing is necessary
- Why less intrusive alternatives are insufficient
This is a critical step. If the same goal can be achieved with less personal data or lower risk, the current approach may violate GDPR principles such as data minimization and proportionality.
3. Risk Assessment to Rights and Freedoms
This section identifies:
- Potential harms to individuals
- Likelihood of those harms occurring
- Severity of impact if they occur
Risks should be evaluated realistically, not optimistically. Overlooking obvious risks is one of the most common compliance failures.
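One common way to structure this step is a likelihood-times-severity matrix. GDPR does not prescribe any particular scale, so the 1–4 levels and the threshold below are assumptions for illustration only:

```python
# Illustrative likelihood x severity scoring. GDPR does not mandate
# a specific scale; the 1-4 levels and threshold here are assumptions.
LIKELIHOOD = {"remote": 1, "possible": 2, "likely": 3, "almost_certain": 4}
SEVERITY = {"minimal": 1, "limited": 2, "significant": 3, "severe": 4}

def risk_score(likelihood: str, severity: str) -> int:
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def is_high_risk(likelihood: str, severity: str, threshold: int = 8) -> bool:
    # Scores at or above the threshold would flag the risk for
    # mitigation (and, if it persists, Article 36 consultation).
    return risk_score(likelihood, severity) >= threshold

# Example: identity theft that is possible and severe
print(risk_score("possible", "severe"))    # 8
print(is_high_risk("possible", "severe"))  # True
```

The point is not the arithmetic but the discipline: every identified harm gets an explicit, documented likelihood and severity rather than an implicit "probably fine".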
4. Measures to Address and Mitigate Risks
Finally, the DPIA must describe:
- Technical safeguards (encryption, access controls)
- Organizational measures (policies, training)
- Legal measures (contracts, safeguards)
- Privacy-by-design features
If residual high risk remains even after mitigation, the organization must consult the supervisory authority before proceeding.
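The residual-risk rule can be sketched as a final check over the risk register. The risk names and levels below are hypothetical; the logic simply expresses the Article 36 trigger described above.

```python
# Illustrative residual-risk check: after applying mitigations, any
# risk still rated "high" triggers prior consultation (Article 36).
risks_after_mitigation = {
    "data breach": "medium",      # reduced by encryption + access controls
    "re-identification": "high",  # mitigation judged insufficient
}

needs_consultation = any(
    level == "high" for level in risks_after_mitigation.values()
)
print(needs_consultation)  # True
```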
DPIA Is Not a One-Time Exercise
A DPIA is a living document, not a static report.
It must be reviewed and updated when:
- Processing changes significantly
- New data categories are added
- Technology evolves
- New risks emerge
- Regulatory guidance changes
Failing to update a DPIA can be as problematic as failing to conduct one at all.
What Happens If High Risk Remains?
If, after implementing all reasonable safeguards, high risk cannot be sufficiently mitigated, Article 36 requires prior consultation with the supervisory authority.
This consultation must include:
- The DPIA
- Roles and responsibilities
- Risk mitigation measures
- Contact details of the DPO
Proceeding without consultation in such cases is a direct GDPR violation.
Consequences of Ignoring Article 35
Failure to comply with Article 35 can lead to:
- Administrative fines (up to €10 million or 2% of global annual turnover, whichever is higher)
- Enforcement orders to stop processing
- Reputational damage
- Civil claims by affected individuals
Regulators increasingly use DPIAs as a benchmark to assess whether an organization took privacy seriously before harm occurred.
5 Practical Examples of GDPR Article 35 in Action
Example 1: AI-Based Recruitment Platform
A company introduces an AI system to automatically screen job applicants by analyzing CVs, video interviews, and behavioral patterns.
Why a DPIA is required:
The system involves profiling, automated decision-making, and potential discrimination risks.
Identified risks:
- Bias against certain demographics
- Lack of transparency
- Incorrect exclusion of candidates
Mitigation measures:
- Human review of decisions
- Bias testing and audits
- Clear explanations to applicants
Outcome:
DPIA completed, system adjusted, processing allowed with safeguards.
Example 2: Mobile Health Monitoring App
A startup launches a mobile app that tracks heart rate, sleep patterns, and physical activity for thousands of users.
Why a DPIA is required:
Large-scale processing of sensitive health data.
Identified risks:
- Data breaches
- Unauthorized access
- Misuse of health insights
Mitigation measures:
- End-to-end encryption
- Strict access controls
- Data minimization
- Short retention periods
Outcome:
Processing proceeds with enhanced security architecture.
Example 3: Facial Recognition in Retail Stores
A retail chain installs facial recognition cameras to detect repeat visitors and prevent theft.
Why a DPIA is required:
Biometric data processing and systematic monitoring in public spaces.
Identified risks:
- Misidentification
- Loss of anonymity
- Disproportionate surveillance
Mitigation measures:
- Clear signage
- Limited data retention
- Alternative non-biometric solutions considered
Outcome:
Project redesigned to remove facial recognition due to excessive risk.
Example 4: Employee Productivity Tracking Software
An employer deploys software that monitors keystrokes, screen activity, and time spent on applications.
Why a DPIA is required:
Monitoring of employees, power imbalance, and invasive data collection.
Identified risks:
- Psychological stress
- Excessive surveillance
- Chilling effect on workers
Mitigation measures:
- Limiting monitoring scope
- Transparency notices
- Aggregated reporting instead of individual tracking
Outcome:
System allowed with strict limitations and employee safeguards.
Example 5: Location Tracking for Delivery Drivers
A logistics company tracks drivers’ real-time locations to optimize routes.
Why a DPIA is required:
Continuous tracking and impact on freedom of movement.
Identified risks:
- Tracking outside working hours
- Misuse of location data
Mitigation measures:
- Tracking disabled off-shift
- Clear retention rules
- Purpose limitation
Outcome:
DPIA completed; processing deemed proportionate.
Key Takeaways
GDPR Article 35 is not about paperwork — it is about responsible decision-making.
A well-executed DPIA:
- Reduces legal risk
- Builds user trust
- Improves system design
- Prevents harm before it happens
Organizations that treat DPIAs as strategic tools rather than compliance burdens are far better positioned to operate safely, ethically, and sustainably in a data-driven world.