Apple has warned that it would rather stop offering iMessage and FaceTime services in the U.K. than bow to government pressure over new proposals that seek to expand the digital surveillance powers available to state intelligence agencies.
The development, first reported by BBC News, makes the iPhone maker the latest to join the chorus of voices protesting against forthcoming legislative changes to the Investigatory Powers Act (IPA) 2016 that would, in effect, render encryption protections meaningless.
Specifically, the Online Safety Bill requires companies to install technology to scan for child sex exploitation and abuse (CSEA) material and terrorism content in encrypted messaging apps and other services. It also mandates that messaging services clear security features with the Home Office before releasing them and take immediate action to disable them if required without informing the public.
While the proposal does not explicitly call for the removal of end-to-end encryption, complying with it would de facto weaken that protection, as the companies offering the services would have to scan all messages in order to flag and take down offending content. This has been viewed as a disproportionate step that would allow the government to carry out bulk interception and surveillance.
Apple told the British broadcaster that such a provision would “constitute a serious and direct threat to data security and information privacy.”
Earlier this April, a number of messaging apps that currently offer encrypted chats, such as Element, Signal, Threema, Viber, Meta-owned WhatsApp, and Wire, published an open letter, urging the U.K. government to rethink its approach and “encourage companies to offer more privacy and security to its residents.”
“The Bill provides no explicit protection for encryption, and if implemented as written, could empower OFCOM to try to force the proactive scanning of private messages on end-to-end encrypted communication services – nullifying the purpose of end-to-end encryption as a result and compromising the privacy of all users,” the letter read.
Apple, which previously announced its own plans to flag potentially problematic and abusive content in iCloud Photos, abandoned the effort last year after receiving pushback from digital rights groups over worries that the capability could be abused to undermine users’ privacy and security.
This is not the first time that the tension between end-to-end encryption and the need to tackle serious crimes online has cropped up.
In May 2021, WhatsApp sued the Indian government to block internet regulations that would compel the messaging app to break encryption by incorporating a traceability mechanism to identify the “first originator of information” or risk facing criminal penalties. The case is still pending.
Apple’s refusal to play ball is in line with its public stance on privacy, one that allows it to position itself as a “privacy hero” in contrast to companies that thrive on collecting user data to serve targeted ads.
But the stance also rings hollow given that every message sent to or received from a non-Apple device falls back to SMS, which does not support end-to-end encryption, potentially opening the door to government surveillance.