Privacy, surveillance and backdoors: the UK puts Apple to the test

The UK’s Investigatory Powers Tribunal has made an unusual yet highly significant decision: it has ruled that the government cannot keep secret an order that sought to compel Apple to create a “backdoor” into its cloud storage system, iCloud.

In other words, the court insists that the existence of the measure be disclosed publicly before any access to Apple’s storage can be granted, as it affects fundamental rights such as digital privacy and raises questions about the scope of state power.

This episode, made public on 7 April, not only sets an important precedent in the ongoing legal battle over encryption, but also exposes an inevitable clash between three forces: individual privacy as a right, the technological sovereignty of global companies, and the power of the state to conduct surveillance in the name of security. 

The ruling has been widely received as a victory for Apple, once again positioning the company as a defender of user privacy. However, this episode should not be taken as definitive proof of its integrity. While Apple has indeed resisted certain forms of state pressure, it has not always acted transparently, nor has it consistently placed privacy above all else. 

Technology vs the State: a structural tension 

The relationship between governments and major tech companies is one of constant friction. In the name of national security, counterterrorism or tackling organised crime, governments regularly demand privileged access to encrypted digital communications. Tech companies, in turn, increasingly argue that such access — often referred to as backdoors — undermines user security and erodes trust in their services. 

Apple has been one of the most prominent players in this ongoing confrontation. It has made user privacy a core pillar of its brand and has historically resisted government requests to weaken encryption or introduce covert access points into its systems. One of the most emblematic cases came in 2016, when the FBI sought Apple’s help to unlock an iPhone used by one of the San Bernardino attackers. Apple publicly refused, arguing that creating a tool to bypass its own security would be dangerous and could be misused against anyone.

The recent case in the UK fits squarely into this same ongoing struggle. The government sought to force a backdoor into iCloud while keeping the order entirely secret. Apple opposed not just the demand for access, but also the lack of transparency around the process. 

This global conflict between public security and digital privacy has no easy resolution. But one thing is clear: the state is no longer the only actor with the power to protect — or violate — individual rights. Tech companies have become guardians — and at times, arbiters — of access to the private lives of millions. 

Privacy champions? A narrative with cracks 

Let’s be clear: Apple has built a powerful image — that of a company placing user privacy at the very heart of its business model. Its recent advertising campaigns focus less on technical innovation and more on data protection. However, this narrative begins to show cracks, especially when contrasted with certain episodes in the company’s history. 

In 2019, it emerged that Apple had allowed third-party contractors to listen to Siri recordings, many of which had been activated unintentionally. These recordings sometimes contained sensitive details of users’ private lives and were analysed without clear and explicit consent. Although Apple temporarily suspended the programme and later introduced stricter controls, the damage to its credibility had already been done: there had been unintentional surveillance within a system marketed as “private by design”. 

Another key moment came in 2021, when Apple announced an automatic scanning system for iCloud photos to detect child sexual abuse material (CSAM). While the objective was legitimate, the method sparked strong backlash from privacy experts and digital rights organisations. The concern: that such technology could pave the way for censorship or mass surveillance, especially in authoritarian contexts. Under pressure, Apple paused the project — and eventually abandoned it altogether in 2023. 

And we don’t need to look far. In countries like China, Apple has complied with government demands to store user data on state-controlled servers and remove politically sensitive apps from its store. This double standard raises an uncomfortable question: how firm is a company’s commitment to privacy when it clashes with commercial expansion or local legal requirements? 

Even within its own ecosystem, Apple has faced criticism for limiting third-party tracking while maintaining its own personalised advertising system, which also relies on user data. 

These examples show that Apple’s defence of privacy has not always been consistent — nor entirely altruistic. That is why public scrutiny shouldn’t be limited to governments alone, but should extend to the practices of those who build and control the technologies we rely on every day.

The transparency dilemma 

When a government requests access to its citizens’ data from a tech company, should it be allowed to do so in secret? And to what extent are companies like Apple obliged to inform users when their data may be subject to surveillance? 

These questions lie at the heart of the UK case, but their relevance goes far beyond one jurisdiction. Transparency is not just an ethical aspiration — it is a democratic tool that allows society to assess whether power, whether state or corporate, is being exercised legitimately. 

The British government’s attempt to keep its surveillance order under wraps reveals a dangerous trend: normalising unaccountable surveillance. But it’s also not enough for tech companies to oppose such orders privately. Citizens have the right to know when, how and why their privacy is being undermined. 

When neither governments nor platforms communicate clearly, public trust begins to erode. And without trust — built on transparency — there can be no real protection of rights, no matter how robust the encryption. 

A final note 

The defence of privacy cannot be blindly entrusted to either governments or corporations, no matter how innovative or well-intentioned they may seem. Surveillance — whether public or private — must always be subject to clear limits, public oversight and civic accountability. Transparency is not a courtesy; it is a democratic obligation. 

At TrustCloud, we believe technology should serve people — not dominate them. That’s why our commitment to human rights and data privacy, through our advanced digital identity solutions, is unwavering. 

If you share this vision, we invite you to discover how we help responsible organisations build trust — through technology, with ethics.
