How to Ensure GDPR Compliance When Using AI

Using AI? Follow our six-step guide to using it in compliance with the UK GDPR.

AI vs GDPR

Over the last few months, the Data Protection Made Easy podcast has reported growing privacy concerns about the use of artificial intelligence (AI). Those developing AI systems rely on vast amounts of personal data to make them function, and whether that data is used appropriately is a further cause for concern.

We’ve already explored AI privacy threats, including data breaches, biased algorithms, deep fakes and cyber attacks. Taking a risk-based approach to AI will ensure you have appropriate measures in place to mitigate these risks and stay compliant.

In this blog, we’ll outline the critical considerations for businesses adopting AI and how to maintain GDPR compliance when using it. 

How to Use AI and Personal Data Lawfully 

1. Assess Business Use of AI Systems

You first need to assess how you are (or will be) using AI across your business. Are you using AI to streamline repetitive tasks or to make better decisions? Whatever your reason, you’re still processing personal data, so you must have a lawful basis for doing so.

2. Conduct a Data Protection Impact Assessment (DPIA)

The UK GDPR requires a DPIA wherever processing is likely to result in a high risk to individuals. As AI is considered a high-risk technology, we recommend carrying out a DPIA to identify and mitigate your AI’s privacy risks.

DPIAs are an excellent way to demonstrate accountability for the use of AI systems and the decisions they make. In your assessment, you must detail how the AI will collect, store and use personal data, the volume and sensitivity of that data, your relationship with the individuals concerned and the expected outcomes. For more information, head to the ICO’s guidance on DPIAs for AI.

3. Respect Data Subject Rights

It’s often difficult to understand how AI systems make decisions, but that doesn’t mean a data subject’s rights can be ignored. Whether you’re processing personal data during development or in everyday use, you must comply with the eight individual privacy rights set out in the UK GDPR.

Here are some ways you can comply with individual rights requests:

  • Right to be informed: You must be clear and open with individuals about what data you’re collecting from them and how and why you intend to use it. This is part of your transparency obligations, so disclose this privacy information before training or using an AI system.
  • Right of access: If you receive a subject access request (SAR), you can fulfil it by providing a copy of the individual’s data. This may be difficult if you’ve applied data minimisation techniques that have drastically changed the data from its original form; in these cases, the ICO states that the data no longer counts as ‘provided’ (i.e., data the individual has provided themselves).
  • Right to rectification: You should already be ensuring data is accurate when you process it, but you will need to rectify it if inaccuracies appear in an individual’s records. Correcting individual records is unlikely to affect your AI system’s performance, as the change sits within a much larger dataset.
  • Right to erasure: You must consider all requests for erasure unless you’re processing the data under a legal or public-interest obligation. You’ll need to erase all of the individual’s personal data from your dataset (see the sketch after this list).
  • Rights related to automated decision-making and profiling: You must notify individuals if their data is used for automated decision-making, including the logic, significance and likely impacts of the processing. This information should also be disclosed in a SAR. As appropriate safeguards, you must give individuals the right to obtain human intervention, express their point of view, contest decisions and understand the logic behind them.
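
The erasure point can be made concrete with a short example. Below is a minimal sketch in Python, assuming a hypothetical CSV dataset with a "subject_id" column; the file names, column name and function are illustrative only, not part of any particular system.

    # Minimal sketch: honouring an erasure request against a training dataset.
    # The file paths and the "subject_id" column are hypothetical placeholders.
    import pandas as pd

    def erase_subject(dataset_path: str, subject_id: str, output_path: str) -> int:
        """Remove every record belonging to one data subject and save the result."""
        df = pd.read_csv(dataset_path)
        before = len(df)
        df = df[df["subject_id"] != subject_id]  # drop all of the individual's rows
        df.to_csv(output_path, index=False)
        return before - len(df)  # number of records erased, useful for your audit trail

    # Example: erase_subject("training_data.csv", "subj-1042", "training_data_erased.csv")

However you implement it, the aim is the same: every record tied to the individual is removed, and you keep a count of what was erased for your own records.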

4. Collect & Process Only the Data You Need 

Without data, there is no AI. Data powers AI systems, from the information machine learning models learn from to the test data used to check how well they perform.

But while AI thrives on mountains of data, the UK GDPR takes the opposite view. Under the data minimisation principle, you should only collect the minimum amount of personal data needed to fulfil your purposes. No more, no less (well, less is always better).

If you’re worried about processing personal data, just remember: only process what you need. You can still hold lots of data and remain compliant, as long as it is ‘adequate, relevant and limited’.

If you’re developing an AI system, there are data minimisation techniques you can use to enhance privacy in the training phase. You could, for example, develop models using synthetic data or add ‘noise’ (aka ‘perturbation’) to the dataset so individual privacy is protected.  
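
As a simple illustration of the ‘noise’ technique, here is a minimal sketch in Python. The toy column names (age, salary), the Laplace distribution and the noise scale are all illustrative assumptions you would tune to your own data, not a prescribed configuration.

    # Minimal sketch: perturbing numeric fields with random noise before training,
    # so that individual values are obscured while overall patterns remain usable.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(seed=42)

    def perturb_numeric(df: pd.DataFrame, columns: list[str], scale: float = 1.0) -> pd.DataFrame:
        """Return a copy of df with Laplace noise added to the given numeric columns."""
        noisy = df.copy()
        for col in columns:
            noisy[col] = noisy[col] + rng.laplace(loc=0.0, scale=scale, size=len(noisy))
        return noisy

    # Toy example: perturb a small table before using it for training
    raw = pd.DataFrame({"age": [34, 52, 29], "salary": [41000, 58000, 36500]})
    training_data = perturb_numeric(raw, ["age", "salary"], scale=2.0)

The scale is a trade-off: more noise gives individuals stronger protection but weakens the signal your model learns from, so it needs to be set with the use case in mind.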

5. Identify Bias and Discrimination Risks Early on 

AI works as a black box, with no common sense or ethical sensitivity of its own. If an AI system is trained on skewed or biased data, its algorithms can perpetuate those biases with no regard for how they affect individuals.

You should address these risks at an early stage to avoid discriminatory outcomes in hiring, facial recognition and so on. To mitigate bias, you must:

  • Evaluate whether the data collected from individuals is accurate, reliable, relevant, representative and current;
  • Identify the potential consequences of the AI system’s decisions on different groups to determine if they’re acceptable (see the sketch after this list).
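
Here is a minimal sketch of one such early check, comparing the rate of positive outcomes across groups in the training data. The toy hiring data, column names and 80% threshold are illustrative assumptions, not a legal test.

    # Minimal sketch: flag large gaps in positive-outcome rates between groups.
    import pandas as pd

    def selection_rates(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
        """Rate of positive outcomes per group (outcome_col assumed to be 0/1)."""
        return df.groupby(group_col)[outcome_col].mean()

    def flags_disparity(rates: pd.Series, threshold: float = 0.8) -> bool:
        """True if any group's rate falls below `threshold` times the best group's rate."""
        return (rates.min() / rates.max()) < threshold

    # Toy hiring data: is one group hired at a markedly lower rate?
    hiring = pd.DataFrame({
        "gender": ["F", "F", "M", "M", "M", "F"],
        "hired":  [0,   1,   1,   1,   0,   0],
    })
    rates = selection_rates(hiring, "gender", "hired")
    print(rates)
    print("Disparity flagged:", flags_disparity(rates))

A flagged gap doesn’t prove discrimination on its own, but it tells you where to investigate before the system goes anywhere near a live decision.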

Can AI be racist? Find out in our blog.

6. Receive External Support for Using AI 

If you use third-party AI systems like ChatGPT, you’re not immune from data protection laws. You will be acting as a data controller rather than a processor, so you’ll need to show how your use of the AI system complies with the UK GDPR.

You can demonstrate this through the points above, such as conducting a DPIA, but you can also appoint an outsourced Data Protection Officer (DPO) to carry out the due diligence for you.

At Data Protection People, our consultants specialise in cyber security and the UK GDPR, making us well placed to help you maintain compliance.

Outsourced Data Protection & Cyber Security Consultancy 

Worried about AI and its impact on your data protection obligations? Data Protection People are here to help. Contact our team for expert advice today.