
Security researchers latest to blast UK’s Online Safety Bill as encryption risk

Nearly 70 IT security and privacy academics have voiced concern that the UK’s Online Safety Bill could undermine strong encryption if it isn’t amended.

In an open letter, 68 UK-affiliated security and privacy researchers warned the draft legislation threatens critical security technologies used to secure digital communications.

The signatories, who develop online security technologies as independent information security and cryptography researchers, warn that the Online Safety Bill threatens the protections provided by end-to-end encrypted services such as WhatsApp, Signal, and Element, all of which have said they would rather withdraw from the UK market or be blocked by authorities than compromise user security.

Apple warned last week that the Bill “seriously threatens” end-to-end encryption, describing the technology as a critical protective capability. Without strong E2EE protections, the company argued, the Bill could put UK citizens at risk despite its “safety” title.

Last year, an independent legal analysis of the draft legislation warned that its surveillance powers could compromise E2EE.

The Bill is currently being scrutinised in the House of Lords, where peers can propose amendments, before it returns to the Commons this summer. The security academics hope their expertise will persuade second-chamber lawmakers to defend encryption where MPs have failed.

In summary, their concern is that the Bill mandates surveillance technologies in the name of online safety, a step that, they argue, undermines both privacy and online safety itself.

The academics, who hold professorships and other positions at universities across the country, including Russell Group research-intensive institutions like King’s College and Imperial College in London, Oxford and Cambridge, Edinburgh, Sheffield, and Manchester, say they want to highlight “alarming misunderstandings and misconceptions around the Online Safety Bill and its interaction with privacy and security technology” with the letter.

The bill’s provision for “routine monitoring” of people’s communications is intended to combat the spread of child sexual abuse and exploitation (CSEA) content, but the academics say it is a sledgehammer to crack a nut, one that will harm the public and society by undermining critical security protocols.

They say routine monitoring of private communications is “categorically incompatible with maintaining today’s (and internationally adopted) online communication protocols that offer privacy guarantees similar to face-to-face conversations” and that “attempts to sidestep this contradiction” by using client-side scanning or “no one but us” crypto backdoors are “doomed to fail on the technological and likely societal level”.

“Technology is not a magic wand,” they say, before summarizing why the two proposed routes to accessing protected private messages, cryptographic backdoors and client-side scanning, violate people’s rights to privacy and security.

“The history of ‘no one but us’ cryptographic backdoors is a history of failures, from the Clipper chip to DualEC,” the experts warn. “All technological solutions give a third party access to private speech, messages, and images under some criteria defined by that third party.”

The letter calls client-side scanning “placing a mandatory, always-on automatic wiretap in every device to scan for prohibited content” and disproportionate in a democratic society.

The experts’ analysis concludes that client-side scanning technology cannot deliver what the bill demands of it.

They write, “This idea of a ‘police officer in your pocket’ has the immediate technological problem that it must both be able to accurately detect and reveal the targeted content and not detect and reveal content that is not targeted, even assuming a precise agreement on what ought to be targeted.” Even client-side scanning technology designed to detect known CSEA has accuracy issues.
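The accuracy problem the researchers describe is, at bottom, a base-rate problem: when targeted content is vanishingly rare relative to the volume of messages scanned, even a scanner with a low false-positive rate flags mostly innocent material. A minimal back-of-the-envelope sketch, using entirely hypothetical numbers (the letter does not quote specific rates), illustrates the effect:

```python
# Base-rate sketch: why a low false-positive rate still produces a flood of
# false flags at messaging scale. All numbers below are hypothetical
# illustrations, not figures from the researchers' letter.

def scan_outcomes(total_messages, prevalence, tpr, fpr):
    """Return (true_positives, false_positives, precision) for a scanner.

    prevalence: fraction of messages that genuinely contain targeted content
    tpr: true-positive rate (probability a targeted message is flagged)
    fpr: false-positive rate (probability a benign message is flagged)
    """
    targeted = total_messages * prevalence      # messages that really match
    benign = total_messages - targeted          # everything else
    tp = targeted * tpr                         # correctly flagged
    fp = benign * fpr                           # innocent messages flagged
    precision = tp / (tp + fp)                  # share of flags that are real
    return tp, fp, precision

# Hypothetical scale: 10 billion messages/day, 1-in-a-million prevalence,
# a 99% detection rate, and a 0.1% false-positive rate.
tp, fp, precision = scan_outcomes(10_000_000_000, 1e-6, 0.99, 0.001)
```

Under these assumed numbers, roughly ten million innocent messages would be flagged per day, and well under 1% of all flags would be genuine, which is the dynamic behind the researchers’ warning about private content being exposed to third-party review.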

They also note recent research showing that such algorithms can be used for covert surveillance and hidden secondary capabilities like facial recognition.

The academics also worry the bill will encourage platforms to run even more intrusive AI models to scan messages for CSEA content. They warn that such technology is not “sufficiently reliable,” so if the bill is implemented, innocent messaging-app users risk having their private messages widely viewed and even being falsely accused of viewing CSEA content.

“This lack of reliability can have grave consequences, as a false positive hit means potentially sharing private, intimate, or sensitive messages or images with third parties, like private-company vetters, law enforcement, and anyone with access to the monitoring infrastructure,” the experts warn. “This may constitute exploitation and abuse of those whose messages are being disclosed.”

They also note that such “far-reaching” client-side scanning AI models would require a high degree of flexibility, which would in turn make them easier to repurpose, “to expand their scope, by compromise or policy change”. That raises the rights-chilling prospect of embedded CSEA-scanning technologies being extended to detect other types of content, and of UK citizens being subjected to steadily greater levels of state-mandated surveillance by default.

We requested a response from the Department for Science, Innovation, and Technology.
