The UK government has ordered Apple to create a way for law enforcement to access encrypted accounts.
At first glance, it’s a move that seems like a straightforward law-and-order issue. But encryption, like so many technological advances, defies assumptions.
The UK’s demand for an encryption backdoor from Apple does not exist in a vacuum. It’s part of a growing global trend.
Are nations right to request (or demand) encryption backdoors as an emergency measure? Techopedia explores the history and the arguments. After all, surely you have a right for your phone to be yours alone?
Key Takeaways
- Apple’s refusal to create encryption backdoors shows the tension between law enforcement needs and protecting user privacy.
- Encryption evolved from simple telegraph codes to complex digital systems.
- A locked iPhone used in the 2015 San Bernardino terrorist attack sparked worldwide debate.
- Project Echelon demonstrated early large-scale government communications monitoring.
- Apple’s encryption system uses public-key cryptography and device-specific private keys to ensure only intended recipients can access messages.
Extreme Cases Show How Difficult the Privacy Debate Is
In 2016, a locked iPhone was at the center of a national security debate in America after it was recovered from one of the perpetrators of the San Bernardino terrorist attack.
The FBI demanded that Apple unlock the iPhone, a key piece of evidence from the December 2015 attack, in which 14 people were killed.
Apple declined the request, arguing that a tool built to break one iPhone’s encryption could be used to break any iPhone’s encryption, putting every user at risk.
The standoff dragged on for months, ending only when the FBI reportedly enlisted outside hackers to break into the phone.
A CBS News poll of a thousand Americans in 2016 found that 50% of the respondents supported the FBI’s stance, while 45% supported Apple’s stance.
This is an unsurprising split: the arguments for unlocking a phone if it could save a life and for an absolute right to privacy are tightly balanced, and it’s possible to believe both at once.
UK’s Snooper’s Charter May Be A Tougher Sell
The San Bernardino case brought the privacy-versus-security debate into focus all over the world.
But the UK’s current push for an Apple backdoor is different. It’s fueled by a law called the Investigatory Powers Act of 2016, sometimes called the ‘Snooper’s Charter,’ which gives the government sweeping powers to demand access to private communications.
During the 19th century, messages were sent by telegraph, and if you wanted to keep a message private, you had to trust the telegraph operator or use coded language. That was encryption at its simplest: a message only made sense to those with the key.
Flash forward to the 1970s, and the British and American governments had a problem. Spies, double agents, and coded messages were slipping through the cracks. To counter this, their intelligence agencies launched an unprecedented operation to intercept international communications in and out of their countries, including transatlantic telegrams. It became known as Project Echelon.
Project Echelon remained a closely guarded secret for decades until investigative journalists uncovered its existence (for example, John Parker’s book Total Surveillance), bringing it into public awareness.
Additional coverage of the story came from the documentary ‘I Spy: With My Five Eyes’ which looked at surveillance networks around the world.
The Australian television drama Pine Gap, which came to Netflix in 2018, was built on similar ideas, exploring the tensions of intelligence operations.
A Backdoor to Privacy
Fast-forward to today, and encryption is everywhere. It’s in our banking apps, medical records, and private conversations. And yet, it remains invisible until a government demands a backdoor.
We assume that security and privacy are opposing forces. To be safe, we must be watched.
But encryption tells a different story. It suggests that safety and privacy are not opposites but deeply intertwined. A backdoor for one government is a backdoor for all: a tool designed to catch criminals can just as easily be turned to criminal purposes.
So we find ourselves in a paradox: the very thing designed to protect us — encryption — can be seen as a threat.
How Apple’s End-to-End Encryption Works & Why It Is Under Fire
To understand why the UK wants a backdoor, we must understand what it’s trying to break into.
Apple’s end-to-end encryption (E2EE) ensures that only the sender and receiver can read messages or access data. Not even Apple can decrypt it.
The main elements are:
- Public-key and symmetric-key encryption: When you send an iMessage, your device generates a pair of encryption keys: one public, one private. The public key is stored on Apple’s servers, while the private key never leaves your device. Each message is encrypted with a symmetric key, which is in turn encrypted with the recipient’s public key, so only the recipient’s device can unlock it.
- Automatic encryption: Messages and FaceTime calls are encrypted before leaving your device and decrypted only on the recipient’s device.
- iCloud protection: Some iCloud data can also be encrypted end-to-end, meaning even Apple can’t access it. However, users must opt in and switch this protection on; otherwise, iCloud backups remain accessible to Apple and are therefore less secure.
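The hybrid public-key/symmetric-key pattern described above can be sketched in a few lines. This is a toy illustration only: textbook RSA with tiny primes and a hash-based XOR stream stand in for the modern primitives (such as elliptic-curve cryptography and AES) that Apple actually uses, and real systems should never roll their own crypto.

```python
# Toy sketch of hybrid encryption: a per-message symmetric key encrypts
# the message, and the recipient's public key "wraps" that symmetric key.
# Illustrative only -- tiny textbook RSA, not a real cipher.
import hashlib
import secrets

# --- Toy RSA key pair (deliberately tiny primes) ---
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent (on Apple's servers)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (stays on the device)

def symmetric_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Keystream derived from the key via SHA-256 (stand-in for AES).
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(plaintext, stream))

symmetric_decrypt = symmetric_encrypt   # XOR is its own inverse

# Sender: encrypt the message with a fresh symmetric key,
# then wrap that key with the recipient's public key.
message = b"hello"
msg_key = secrets.randbelow(n - 2) + 2              # per-message key
ciphertext = symmetric_encrypt(msg_key.to_bytes(2, "big"), message)
wrapped_key = pow(msg_key, e, n)                    # encrypt with public key

# Recipient: unwrap the key with the private key, then decrypt the message.
recovered = pow(wrapped_key, d, n)
plaintext = symmetric_decrypt(recovered.to_bytes(2, "big"), ciphertext)
assert plaintext == message
```

The design point survives the simplification: the private exponent `d` never leaves the recipient’s device, so an intermediary holding only the public values cannot recover the message.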
The Apple Backdoor: Creating Vulnerabilities
In December 2023, Apple was informed about the ‘GoFetch’ attack, a side-channel vulnerability in its M-series chips that could allow attackers to extract encryption keys from Macs.
While GoFetch targeted cryptographic implementations on the chip rather than Apple’s E2EE protocol itself, it was a reminder that even the best security systems have weaknesses.
The UK’s demand, however, isn’t about exploiting existing vulnerabilities; it’s about deliberately creating one as a backdoor.
The UK has stated its position that law enforcement needs encryption access to fight crime and terrorism.
However, encryption is by its very essence intended to keep information confidential; in this case, so confidential that not even Apple can access it. And so the debate rages on across decades, continents, and courtrooms.
But here’s the surprising part: encrypted data often tells us more than we expect, not less.
While governments and tech companies argue whether messages should be accessible, law enforcement often gleans insights from metadata, such as who is talking to whom, how frequently, and when. This metadata can be as revealing as the content itself.
A 2013 study by Stanford researchers found that analyzing metadata alone could reveal intimate details about people’s lives, religious beliefs, medical conditions, and even romantic relationships. This defies the common assumption that encryption makes all digital communication completely opaque.
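The kind of inference those researchers describe is easy to sketch. The call records below are invented, but they show how frequency and timing alone, with no message content at all, can suggest sensitive facts:

```python
# Minimal sketch of metadata analysis: given only (caller, callee,
# timestamp) records, patterns alone suggest relationships and
# conditions. All names and records here are invented.
from collections import Counter
from datetime import datetime

call_log = [  # (caller, callee, timestamp) -- hypothetical records
    ("alice", "oncology_clinic", datetime(2025, 3, 3, 9, 15)),
    ("alice", "oncology_clinic", datetime(2025, 3, 10, 9, 5)),
    ("alice", "oncology_clinic", datetime(2025, 3, 17, 9, 20)),
    ("alice", "bob", datetime(2025, 3, 3, 23, 40)),
    ("alice", "bob", datetime(2025, 3, 4, 23, 55)),
    ("alice", "bob", datetime(2025, 3, 5, 23, 30)),
]

# Who does Alice contact most, and at what hours?
contacts = Counter(callee for caller, callee, _ in call_log
                   if caller == "alice")
late_night = [callee for _, callee, ts in call_log if ts.hour >= 22]

print(contacts.most_common())
print(late_night)
```

Weekly morning calls to a clinic hint at a medical condition; repeated late-night calls to one person hint at a close relationship. Nothing here required decrypting a single message.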
Yet, governments continue to push for direct access to message content, believing some secrets are too dangerous to remain hidden.
The Bottom Line
The UK’s demand for encrypted data is framed as a fight against crime. But there’s a paradox: weakening encryption also weakens it for cybercriminals.
Imagine a city where every house has the same master key held by law enforcement. If that key gets stolen, everyone is at risk. That’s essentially what a government ‘backdoor’ into encryption does: it creates a single point of failure.
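The master-key analogy maps onto what cryptographers call key escrow. A minimal toy sketch (invented names, with a hash-based XOR cipher standing in for real encryption) shows the single point of failure:

```python
# Toy sketch of key escrow: each user's data is encrypted with their own
# key, but every user key is also wrapped under one escrow key. Whoever
# obtains that single key can read everything. Illustration only.
import hashlib
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Keystream from the key via SHA-256; XOR twice restores the input.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

escrow_key = secrets.token_bytes(16)    # the single 'master key'

vault = {}
for user, secret_note in [("alice", b"medical records"),
                          ("bob", b"bank details")]:
    user_key = secrets.token_bytes(16)
    vault[user] = {
        "data": xor_cipher(user_key, secret_note),
        "escrowed_key": xor_cipher(escrow_key, user_key),  # the backdoor
    }

# An attacker who steals only escrow_key recovers every user's data:
for user, entry in vault.items():
    user_key = xor_cipher(escrow_key, entry["escrowed_key"])
    print(user, xor_cipher(user_key, entry["data"]))
```

One leaked value unlocks every vault entry, which is exactly the risk a mandated backdoor introduces at scale.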
And the debate isn’t just about technology — it’s about values. Should we prioritize security at the cost of privacy? Or does privacy create security by ensuring that no one, not even governments, can exploit our data?
One thing is clear: encryption is no longer just a technical issue. It’s a societal one that shapes how we communicate, govern, and protect our most personal information.