Location Data for Sale: A Wake-Up Call for UK Organisations

Learn why location data counts as personal data under UK GDPR, the risks of “anonymised” tracking, and how UK organisations can stay compliant.

A recent RTÉ Prime Time investigation exposed how the real-time movement of tens of thousands of smartphones was being sold on the open market. The story, though focused on Ireland, is a stark warning for UK organisations that process or share location data. If location data can be traced back to individuals, it is personal data under UK GDPR. Misusing it could lead to serious enforcement action and loss of public trust.

What Happened

Undercover journalists posed as a data analytics company and purchased location data showing two weeks of movement for around 64,000 mobile phones. The dataset revealed daily routines, routes and even visits to sensitive sites like government buildings and prisons. Despite claims of “anonymisation”, investigators easily re-identified users by tracing data to home addresses and workplaces.

In response, Ireland’s Data Protection Commission launched an investigation into the data broker’s practices. The case mirrors ongoing global concerns about the misuse of mobile location data, issues that are equally relevant under UK GDPR and PECR.

Location Data as Personal Data

UK GDPR explicitly treats location information as personal data: the definition of personal data in Article 4 lists “location data” alongside names and online identifiers. In practice, this means a record of a person’s physical movements, whether captured via GPS, Wi-Fi or cell towers, can identify them and is therefore protected. Regulatory guidance also groups location data with “private and subjective” information such as religion or political views. In other words, even though raw GPS coordinates are not a “special category” of data, location trails can quickly become as revealing as declared sensitive information.

  • Location data comes with high responsibility: organisations must treat it carefully under UK GDPR’s principles (lawfulness, purpose limitation, data minimisation, etc.). They should be transparent, provide clear privacy notices, and obtain valid consent or establish another lawful basis before tracking.

Location Data Can Reveal Sensitive Details

Long-term tracking of movement patterns can expose highly personal traits. For example, ICO guidance emphasises that a 24/7 log of someone’s whereabouts is “highly intrusive”, as it “is likely to reveal a lot of information about them, including the potential to infer sensitive information such as their religion, sexuality, or health status.”

FTC regulators in the US have made similar points. In a complaint against a location-broker, the FTC noted that “Location data can expose sensitive information such as medical conditions, sexual orientation, political activities, and religious beliefs.”

In practice, detailed location logs can be cross-referenced with public data to infer private traits. For example, regular attendance at a particular church or mosque can reveal faith, frequent visits to a clinic or mental-health centre can imply medical issues, and patterns of travel to political rallies or social venues can hint at ideologies or sexuality.

  • Examples of sensitive inferences: A person’s home, work, places of worship, or health clinics are obvious “sensitive” sites. Data brokers have sold segments like “pregnant women” or “people going to abortion clinics” by detecting patterns in GPS data.
  • Risk of profiling and ads: Online ad networks also use location to profile users. Under UK law, using tracking data for targeted advertising requires strict consent. In reality, however, many apps leak precise location to marketing firms. Investigations found that even innocuous apps (games, fitness or prayer apps) have been co-opted to harvest location data for sale. This means a user may see ads not only for local restaurants, but also for sensitive services, such as medical treatments, based on an inferred profile.

Re-identifying “Anonymous” Location Trails

Simply stripping names off GPS data is not enough to make it safe. Mobility records are notoriously unique. The EU’s data protection board warns that location traces “are known to be notoriously difficult to anonymise”, citing research showing that even a few points of a person’s movement make them re-identifiable.

In one landmark study, only four random spatio-temporal points (latitude/longitude plus time) were enough to uniquely identify 95% of individuals in a large mobility dataset. Even coarse data (such as cell-tower regions and hours rather than exact GPS minutes) proved only marginally safer: most people remained unique with just a handful of points. In short, an “anonymised” location database can often be re-linked to individuals by matching it with outside information, such as known home or work addresses or social media check-ins.
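To make the uniqueness problem concrete, here is a minimal synthetic sketch in Python. Every number and name in it is invented for illustration, not drawn from the study itself: even with coarse cell-and-hour points, knowing four points from one person’s trace typically matches only that trace in a crowd of a thousand.

```python
import random

# Synthetic illustration of why a few spatio-temporal points can
# single out one movement trace among many. All figures here are
# invented for demonstration, not taken from the cited research.
random.seed(42)

N_USERS, POINTS_PER_USER = 1000, 50
# A "point" is (cell_id, hour): coarse location plus coarse time.
traces = {
    user: {(random.randrange(200), random.randrange(24))
           for _ in range(POINTS_PER_USER)}
    for user in range(N_USERS)
}

def matching_users(known_points):
    """Return every user whose trace contains all the known points."""
    return [u for u, trace in traces.items() if known_points <= trace]

target = 0
# An adversary observes just four points of the target's trace...
known = set(random.sample(sorted(traces[target]), 4))
candidates = matching_users(known)
# ...and typically only the target's trace matches all four.
print(len(candidates))
```

The intuition carries over to real data: the more points an attacker knows (a home address, a workplace, one social media check-in), the faster the candidate set collapses to a single person.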

User Consent Issues

Beyond official cases, everyday privacy concerns arise with location tracking:

  • Mobile App Permissions: Many smartphone apps request location permission (for “better experience” or ads) and users often grant it without realising. Studies show thousands of popular apps, even games or utility apps, leak location via ad networks. In many cases users are unaware their movements are shared with marketing brokers.
  • Behavioural Advertising: Companies build profiles from location info. Under UK law, using tracking cookies or device signals for targeted advertising requires clear consent. However, some websites push “cookie walls” or confusing consent banners (a form of “dark pattern”) to force acceptance. ICO guidance warns that mandatory “take-it-or-leave-it” consent (no free choice) is usually invalid.
  • Surveillance Advertising: Location-based surveillance advertising, showing ads based on precise location behaviour, poses GDPR challenges. For instance, an ad network could infer health or beliefs (e.g. showing ads for political causes to someone who visited a rally). ICO guidance is clear that any profiling of user attitudes or preferences, which location-based targeting inevitably involves, requires transparency and consent.

What You Should Be Doing Now

Principles for Responsible Processing

  • Necessity and Justification: Only collect location if essential for the service. As the ICO puts it, tracking people’s movements “requires a strong justification”. Consider less intrusive alternatives first.
  • Consent and Notice: Be clear with users why you need location data, how you use it, and get valid consent when profiling or advertising. Avoid dark patterns in consent requests.
  • Data Minimisation and Retention: Store the minimum location detail needed, for example use coarse location if possible, and retain it only as long as required. Given the risk of re-identification, controllers should destroy or truly anonymise logs when no longer needed.
  • Security and Access Controls: Because location data is sensitive, it must be well secured, with encryption and strict access controls. Log who accesses location information, and have a robust breach response plan.
  • Right to Object: Remember that data subjects have the right to object to profiling. Companies should provide easy ways for users to opt out of location-based tracking or data sharing.
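As a concrete illustration of the data minimisation principle above, here is a hypothetical helper (the function name and figures are our own, not from any library or guidance) that coarsens a location fix before storage: two decimal places of a degree is roughly kilometre-level precision, and the timestamp is truncated to the hour. As the re-identification research shows, coarsening alone is not true anonymisation, but storing less detail than the raw GPS reading reduces what a retained trail can reveal.

```python
from datetime import datetime

def minimise_fix(lat: float, lon: float, ts: datetime,
                 decimals: int = 2) -> tuple:
    """Coarsen a location fix before storage: two decimal places of
    a degree is roughly 1 km of precision, and the timestamp is cut
    to the hour. A sketch of 'store less than you collected'."""
    coarse_ts = ts.replace(minute=0, second=0, microsecond=0)
    return round(lat, decimals), round(lon, decimals), coarse_ts

# Example: a precise fix in central Leeds becomes a ~1 km,
# hour-level record rather than an exact position and second.
print(minimise_fix(53.79972, -1.54935, datetime(2024, 5, 1, 14, 37, 12)))
```

Whether kilometre-and-hour granularity is appropriate depends on the service; the point is to decide the coarsest level that still delivers it, and to document that decision.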

By following these principles and keeping abreast of ICO and EDPB guidance, organisations can handle location data more responsibly. The ICO’s enforcement action against the Home Office over GPS tagging shows that regulators will scrutinise any 24/7 monitoring. With “always-on” location services on our phones and devices, businesses and governments alike must respect that location trails reveal the contours of people’s private lives.

Practical Steps

  • Audit your data flows – Map out all sources and uses of location or behavioural data, including mobile apps, analytics tools and advertising platforms.
  • Review contracts and suppliers – If you use data brokers or adtech partners, ensure they comply with UK GDPR and do not sell or re-use data unlawfully.
  • Strengthen anonymisation practices – Follow the ICO’s Anonymisation and Pseudonymisation Guidance to assess and document re-identification risks.
  • Refresh consent and transparency notices – Make sure privacy notices clearly explain any sharing or selling of location data, including the lawful basis for doing so.
  • Carry out a DPIA – Conduct a Data Protection Impact Assessment for any project involving tracking or profiling users through location or behavioural data.
  • Train staff and developers – Everyone involved in collecting or processing location data should understand their obligations and the potential risks.

At Data Protection People, we help organisations conduct DPIAs, assess anonymisation standards, and audit third-party data flows. If your organisation collects or shares location data, now is the time to act before regulators come knocking.

Our View / Final Thoughts

The RTÉ revelations underscore a growing issue: location data is among the most valuable, but also the most dangerous, forms of personal data. For UK businesses, this means tightening internal controls, demanding transparency from suppliers, and taking accountability seriously. “Anonymous” data is not always anonymous, and claiming so will not protect you from enforcement.

The ICO has already signalled a tougher stance on data brokers, consent mechanisms, and dark patterns. Organisations that proactively embed privacy-by-design and transparency will not only avoid penalties, but also strengthen customer trust in an era of growing data awareness.

FAQs

Does UK GDPR treat location data as personal data?

Yes. Location data can directly or indirectly identify an individual, which makes it personal data under Article 4 of the UK GDPR.

Is selling anonymised data allowed in the UK?

Only if it is genuinely anonymous and cannot be re-identified. If there is any realistic possibility of re-identification, it remains subject to UK GDPR.

What if our organisation uses third-party analytics tools?

You remain responsible for compliance. Review contracts, verify privacy practices, and complete DPIAs where tracking or profiling occurs.

Has the ICO fined organisations for data misuse before?

Yes. Examples include Experian’s enforcement notice (2023) and Clearview AI’s £7.5 million fine (2022) for unlawful data scraping. Location data misuse could attract similar penalties.

What support is available?

If you’re unsure about your obligations, Data Protection People’s support services can help with audits, DPIAs and policy reviews.

If you process, share or purchase location data, take action now. Our team at DPP can help ensure your practices are compliant, ethical and defensible.
