The Growing Privacy Dilemma

Written by Catarina Santos

End-to-end encryption, Instagram, and the growing privacy dilemma

Meta has announced that it will remove end-to-end encryption from Instagram direct messages, raising questions not just about privacy, but about safety, regulation, and where the balance should sit.

What is end-to-end encryption?

End-to-end encryption is a way of protecting communications so that only the sender and the recipient can read the content. In simple terms, messages are “locked” on the sender’s device and can only be “unlocked” on the recipient’s device. No one else – not even the platform providing the service – can, at least in theory, access the content of those messages.

This is widely considered one of the strongest forms of privacy protection available online. It reduces the risk of data breaches, unauthorised access, and surveillance.

However, it also means that platforms themselves cannot monitor what is being shared.
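The “lock on one device, unlock on the other” idea can be sketched in a few lines of code. This is a deliberately simplified toy (a one-time-pad XOR with a pre-shared key, with function names invented for illustration), not how real messengers work – production systems use public-key exchange and key ratcheting – but it shows the essential property: the platform relaying the message only ever handles ciphertext it cannot read.

```python
import secrets

def lock(key: bytes, message: bytes) -> bytes:
    """'Lock' the message on the sender's device (toy one-time-pad XOR)."""
    assert len(key) >= len(message), "pad must be at least as long as the message"
    return bytes(k ^ m for k, m in zip(key, message))

def unlock(key: bytes, ciphertext: bytes) -> bytes:
    """'Unlock' on the recipient's device; XOR with the same key reverses it."""
    return bytes(k ^ c for k, c in zip(key, ciphertext))

# The key is shared only between the two endpoints, never with the platform.
shared_key = secrets.token_bytes(64)

plaintext = b"see you at 6pm"
ciphertext = lock(shared_key, plaintext)   # this is all the platform relays

assert ciphertext != plaintext             # the platform sees only noise
assert unlock(shared_key, ciphertext) == plaintext
```

The point of the sketch is the trade-off discussed below: because the service in the middle never holds the key, it also cannot scan the content – for any purpose, good or bad.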

What is happening with Instagram?

Meta has confirmed that it will discontinue end-to-end encrypted messages on Instagram from May 2026. The feature, which was introduced relatively recently, allowed users to send messages that even Meta could not read. Its removal means that Instagram messages will no longer have this level of protection.

The company has suggested that the feature had low usage.

At the same time, there is a wider context. Regulators and policymakers – particularly in the UK (notably through the Online Safety Act), the US and Europe – have been placing increasing pressure on platforms to improve child safety and prevent harm online.

End-to-end encryption has become a focal point in that debate, because it limits the ability of platforms to detect illegal or harmful content.

TikTok’s contrasting approach

Interestingly, not all platforms are moving in the same direction. As Mark and I mentioned on the podcast a couple of weeks ago, TikTok has publicly stated that it does not intend to introduce end-to-end encryption for direct messages.

Its reasoning is clear: encryption of this kind can make it harder to detect harmful behaviour, including abuse and exploitation. Without visibility of message content, both platforms and law enforcement may struggle to investigate concerns. In other words, TikTok is, by its own account, prioritising safety and oversight over maximum privacy in messaging.

The dilemma: privacy vs protection

This brings us to the core issue – one that organisations, regulators, and society more broadly are still grappling with. As mentioned on the podcast, we are facing a genuine dilemma:

On one hand, encryption of this kind is a powerful safeguard: it supports confidentiality and reduces the risks of hacking and misuse of data. On the other, it can limit the detection of harmful or illegal activity, it can create challenges for safeguarding children and vulnerable users (who, as Charlotte mentioned on the podcast, can be anyone depending on the context), and – perhaps most significantly – it reduces the ability of platforms to intervene proactively.

In the UK, this tension is reflected in legislation such as the Online Safety Act, which places duties on platforms to protect users – particularly children – from harm, while also raising concerns about how that can be achieved without weakening encryption.

From a data protection standpoint, this issue sits at the intersection of several key principles:

  • Confidentiality and security (protecting personal data)
  • Accountability (ensuring organisations can manage risks)
  • Protection of vulnerable individuals, particularly children

There is, however, one thing that is clear: there is no one-size-fits-all answer!

Strong encryption aligns closely with UK GDPR principles around security and integrity; but organisations also have obligations to mitigate risks and prevent harm, particularly where children are concerned.

The shift by Meta – and the contrasting stance from TikTok – highlights that there is no settled industry position.

End-to-end encryption is often framed as a purely technical feature, but it is more than that: Meta’s decision to remove it from Instagram, alongside TikTok’s refusal to adopt it, shows just how complex the balance has become. The challenge ahead is not simply whether to use encryption, but how to reconcile two equally important goals:

protecting people’s privacy, and protecting people from harm.

Important links:

  • https://www.bbc.co.uk/news/articles/cly2m5e5ke4o
  • https://mashable.com/article/instagram-meta-end-to-end-encryption