
Legal expert: UK’s Online Safety Bill’s surveillance provisions put E2EE at risk

An independent legal analysis of the Online Safety Bill, the UK government’s contentious proposal to regulate online speech under a safety-focused framework, concludes that the draft legislation contains some of the broadest mass surveillance powers ever proposed in a Western democracy, and warns that they pose a risk to the integrity of end-to-end encryption (E2EE).

Index on Censorship, an organisation that promotes freedom of expression, commissioned the opinion from the barrister Matthew Ryder KC of Matrix Chambers.

Ryder was tasked with assessing whether the bill’s provisions are compatible with human rights law.

He concludes that, as drafted, the bill lacks crucial safeguards on its surveillance powers, meaning that, absent further amendment, it is likely to violate the European Convention on Human Rights (ECHR).

Parliamentary debate on the bill was put on hold over the summer, and again in October, as a result of political turmoil within the ruling Conservative Party. Following the appointment of a new digital minister and two changes of prime minister, the government has said it intends to amend the draft; however, those changes are focused on provisions relating to so-called “legal but harmful” speech, not the gaping human rights hole highlighted by Ryder.
We contacted the Home Office for a response to the problems identified in his legal opinion.

In response, a government spokesperson emailed a statement from Tom Tugendhat, the minister for security, rebutting the criticism:

“The Online Safety Bill ensures that we are able to defend ourselves from online crimes, including child sexual exploitation, and places privacy at the center of its measures. No particular technology or service design is forbidden.

“It is right that Ofcom, the independent regulator, has the authority, as a last resort, to order these firms to take action where a company fails to address child sexual exploitation on its platforms.

“End-to-end encryption can be used in a way that is consistent with public safety, and strong encryption preserves our privacy and our online economy. The Bill makes sure that tech corporations don’t give the most dangerous predators online a safe haven.”

The bill gives the state broad authority to require digital providers to monitor users’ online communications “on a generalized and widespread basis,” according to Ryder’s analysis, but it lacks any form of independent prior authorization (or independent ex post facto oversight) for the issuance of content scanning notices.

In Ryder’s opinion, this lack of independent oversight would likely breach Articles 8 (right to privacy) and 10 (right to freedom of expression) of the ECHR.

The (also highly contentious) Investigatory Powers Act 2016 (IPA), which grants the UK security services very broad surveillance powers, does at least contain legal checks and balances for approving the most intrusive capabilities, such as requiring judges to sign off on intercept warrants.

The Online Safety Bill, however, leaves the decision to issue the most intrusive content-scanning notices to the designated internet regulator, a public body that Ryder argues is insufficiently independent for this function.

“The statutory structure does not provide for independent authorisation of 104 Notices even though they may require private bodies to carry out widespread state monitoring of millions of users’ communications, at the behest of a public entity. Additionally, there is no option for independent ex post facto oversight,” the opinion states. “In this scenario, we do not believe that Ofcom, the state regulator, can be regarded as an independent entity.”

The “bulk surveillance” of online communications envisioned by the Online Safety Bill may not pass another important human rights test of being “necessary in a democratic society,” he adds, given the existing vast surveillance powers granted by the IPA.

Bulk surveillance powers under the IPA must also be linked to a national security concern; they cannot be used solely for the prevention and detection of serious crime between UK users. The Online Safety Bill, which his legal analysis says grants similar “mass surveillance” powers to Ofcom, covers a much wider range of content than strictly national security issues, and so appears far less constrained.

Ruth Smeeth, the CEO of Index on Censorship, criticized the bill’s overreach in a statement in response to Ryder’s legal opinion, writing:

This legal opinion lays out the plethora of problems with the Online Safety Bill. The legislation’s ambiguous wording will force the media regulator, Ofcom, to make independent decisions about how to use sweeping surveillance powers that would touch practically every element of British daily life online. Surveillance is perhaps the most glaring example of regulatory overreach in a bill that is not fit for purpose.

Impact on E2EE
Although risks to freedom of expression have received most of the attention in the debate around the Online Safety Bill, which was published in draft form last year and has since been repeatedly altered and broadened in scope by the government, the proposal raises a number of other noteworthy issues. Among them is how the law’s content-scanning requirements would affect E2EE, with critics such as the Open Rights Group arguing that it will effectively coerce service providers into weakening strong encryption.

Those concerns have intensified since the bill was introduced, particularly after a government amendment in July proposed new powers for Ofcom to require messaging platforms to apply content-scanning technologies even where communications on their service are strongly encrypted. The amendment made clear that a regulated service could be compelled to use “best endeavours” to develop or source technology for detecting and removing CSEA in private communications, putting it in direct conflict with E2EE.
E2EE remains the “gold standard” for encryption and online security, and is used by popular messaging services such as WhatsApp, iMessage and Signal, to name a few. It provides users’ online communications with strong protection and privacy.
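
For readers unfamiliar with the model, here is a minimal sketch of the E2EE principle in Python, using the PyNaCl library. It illustrates the concept only; it is not the actual protocol (such as the Signal protocol) that the messengers named above implement.

```python
# Minimal E2EE sketch using PyNaCl: only the endpoints hold keys,
# so the relaying service sees ciphertext it cannot read.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and his public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The service only ever relays `ciphertext`; decrypting it requires
# Bob's private key, which never leaves his device.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

The “zero knowledge” property discussed later in this piece falls out of this design: because the server never holds a decryption key, it cannot inspect message content, which is exactly what content-scanning mandates would have to work around.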

Any legislation that restricts the use of this standard, or opens new security holes in E2EE, could therefore have a significant global impact on online users’ security.

The Online Safety Bill’s content scanning clauses, which pose an existential risk to E2EE, are the subject of Ryder’s legal opinion.

The majority of his legal analysis focuses on Clause 104 of the bill, which gives the designated internet watchdog (the current media and communications regulator, Ofcom) new authority to issue notices to in-scope service providers requiring them to identify and remove terrorism content communicated “publicly” through their services, or child sexual exploitation and abuse (CSEA) content communicated “publicly or privately.” Again, it is the inclusion of “private” communications where things start to get really tricky for E2EE.

Rather than expecting messaging platforms to abandon E2EE entirely in order to comply with 104 Notices issued by Ofcom, Ryder believes the bill will push them toward deploying a contentious technology called client-side scanning (CSS). He predicts that CSS will “likely” be the primary technology whose use is mandated.

“Clause 104 does not specifically mention CSS (or any other technology). Only ‘accredited technology’ is mentioned,” he notes. The bill describes the accredited technology referred to in Clause 104 as a form of “content moderation technology,” which is defined as “technology, such as algorithms, keyword matching, image matching,” a description consistent with CSS. “However, the practical implementation of 104 Notices requiring the identification, removal, and/or blocking of content leads almost inevitably to the concern that this power will be used by Ofcom to mandate CSPs [communications service providers] using some form of CSS,” he writes.

In addition, he cites a piece written this summer by two senior GCHQ officials who “endorsed CSS as a potential solution to the problem of CSEA content being transmitted on encrypted platforms,” adding that their remarks were made “against the backdrop of the ongoing debate about the OLSB [Online Safety Bill].”

“Any effort to force CSPs to limit the use of end-to-end encryption generally would have significant effects on the safety and security of all international online communications,” the opinion continues. “We are unable to think of any circumstances that could justify taking such a detrimental step for the security of worldwide online communications for billions of users.”

Client-side scanning risks
CSS is a contentious scanning technology that inspects the content of communications in an effort to find problematic material before encryption is applied. Typically, a message or attachment is converted into a digital fingerprint on the sender’s device before being encrypted and transferred, and that fingerprint is compared against a database of fingerprints of known objectionable content (such as CSEA) to look for matches. The comparison itself can happen on the user’s own device or on a remote service.
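
As a rough illustration of that flow, the sketch below uses a plain SHA-256 exact match in Python. Real deployments use perceptual fingerprints (PhotoDNA-style hashes) that also catch near-duplicates; the database contents and the responses to a match here are illustrative assumptions, not anything specified in the bill.

```python
import hashlib

# Fingerprints of known prohibited content, supplied by a central
# authority. For this demo the set holds the SHA-256 of b"foo";
# in practice the client cannot tell what the entries denote.
KNOWN_BAD_FINGERPRINTS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(content: bytes) -> str:
    """Reduce content to a fixed-size digest before encryption."""
    return hashlib.sha256(content).hexdigest()

def client_side_scan(message: bytes) -> bool:
    """Runs on the sender's device, before E2EE is applied.
    Returns True if the message matches the blocklist."""
    return fingerprint(message) in KNOWN_BAD_FINGERPRINTS

message = b"foo"  # the attachment or message body about to be sent
if client_side_scan(message):
    ...  # block, flag, or report the message (policy-dependent)
else:
    ...  # proceed: encrypt end-to-end and transmit as normal
```

Note that the matching logic is entirely agnostic to what the database contains: swap in a different fingerprint list and the same code scans for different content, which is the root of the “censorship creep” worry raised below.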

Wherever the comparison takes place, privacy and security experts contend that CSS undermines the E2EE trust model, since it fundamentally breaks the “zero knowledge” goal of end-to-end encryption and creates new dangers by opening novel attack and/or censorship channels.

For instance, they draw attention to the possibility of embedded content-scanning infrastructure enabling “censorship creep,” as a state could mandate that communications providers scan for an ever-widening range of “objectionable” content, from copyrighted material all the way up to expressions of political dissent that are unpopular with an autocratic regime, since tools developed within a democratic system are unlikely to stay confined to one part of the world.

Apple attempted to implement CSS on iOS customers’ devices last year, when it announced it would start checking uploaded iCloud Photos for known child abuse images, a move that sparked a massive backlash from privacy and security experts. Apple appears to have abandoned the project after pausing it, then quietly removing mention of it in December. However, legislation like the UK’s Online Safety Bill, which uses the same purported child safety argument to embed and enforce content scanning on platforms, could resurrect similar initiatives by requiring the implementation of CSS.

Notably, the UK Home Office has been actively encouraging the development of content-scanning technologies that could be applied to E2EE services. Last year, it announced the “Safety Tech Challenge Fund” to spend taxpayer money on the creation of what it billed as “innovative technology to keep children safe in environments like online messaging platforms with end-to-end encryption.”

Five winning projects were announced under that challenge in November. How “developed” or accurate these prototypes are remains unclear. Yet despite the immaturity of such technology, the government is pushing ahead with online safety legislation that, according to this legal expert, will in effect oblige E2EE platforms to perform content scanning and will promote the use of CSS.
Speaking about the government’s proposed amendment to Clause 104, which would allow Ofcom to require communications service providers to “use best endeavours” to develop or source their own content-scanning technology achieving the same goals as the accredited technology the bill also envisions the regulator signing off, Ryder predicts that any such solution would likely be CSS or something similar. He considers it extremely improbable that CSPs would instead choose to disable end-to-end encryption on their services entirely: they would still have to examine the substance of messages in order to find the relevant material, and abandoning E2EE would irreparably compromise the security of their users and their platforms, which would very likely lead many people to switch to other services.

“[I]f 104 Notices were issued across all eligible platforms, this would mean that service providers would be continuously monitoring the content of practically all internet-based communications by millions of people, including the specifics of their private conversations,” the opinion warns. Whether this occurs will obviously depend on how Ofcom uses its authority to issue 104 Notices, but the tension between the apparent goal and the requirement of proportionate use is obvious.

Broad surveillance capabilities are, in short, being built into the Online Safety Bill as extremely big sticks to enforce compliance, and service providers that fail to comply face a range of harsh penalties.

The proposed law would authorize fines of up to 10% of annual global turnover, or £18 million, whichever is higher. Additionally, the bill would give Ofcom the legal authority to ask a court for “business disruption measures,” such as blocking non-compliant services from the UK market. Senior executives at providers who fail to cooperate with the regulator could even face criminal charges.
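
In concrete terms, the “whichever is higher” rule means the £18 million floor only binds for smaller providers; for large platforms the turnover-based figure dominates. A quick sketch, with hypothetical turnover figures:

```python
# Maximum fine under the bill as described above: 10% of annual
# global turnover or a GBP 18m floor, whichever is greater.
def max_fine(annual_global_turnover_gbp: float) -> float:
    return max(0.10 * annual_global_turnover_gbp, 18_000_000)

# A provider turning over GBP 50m hits the floor (10% is only 5m):
assert max_fine(50_000_000) == 18_000_000
# For a platform turning over GBP 100bn, 10% dominates: GBP 10bn.
assert max_fine(100_000_000_000) == 10_000_000_000
```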

For its part, the UK government has thus far dismissed worries about how the law may affect E2EE.

A government fact sheet states, in the section on “private chat networks,” that Ofcom would only be able to require content-scanning technology “as a last resort.” The same document claims these scanning technologies will be “very accurate,” without offering any evidence. It further says that “use of this ability will be subject to tight restrictions to protect users’ privacy,” and continues: “Highly precise automated techniques will verify that legal content is not harmed. To utilize this authority, Ofcom must be certain that no other actions would be equally effective and that there is proof of a serious issue with a service.”

The idea that cutting-edge AI will be “very accurate” for broad-based content scanning at scale is obviously dubious, and it is a claim that needs solid evidence to support it.

One need only consider how ineffective AI has proven to be at content moderation on mainstream platforms, which is why thousands of human contractors are still employed to review content flagged by automated systems. It therefore seems highly improbable that the Home Office could foster the creation of a far more effective AI filter than tech giants like Google and Facebook have managed to build over the past couple of decades.
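
A back-of-envelope calculation shows why accuracy claims deserve scrutiny at messaging scale. The volume and error-rate figures below are illustrative assumptions, not numbers from the bill or the fact sheet:

```python
# Base-rate arithmetic: even a scanner that is 99.9% accurate on
# innocent content produces huge absolute error counts when it is
# applied to every message. All figures are assumptions.
daily_messages = 5_000_000_000   # messages scanned per day (assumed)
false_positive_rate = 0.001      # 0.1% of innocent content misflagged

false_alarms_per_day = daily_messages * false_positive_rate
print(f"{false_alarms_per_day:,.0f} innocent messages flagged per day")
# -> 5,000,000 innocent messages flagged per day, each a private
#    communication potentially exposed to review.
```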

Regarding restrictions on the use of content scanning notices, Ryder’s opinion mentions safeguards found in Clause 105 of the bill, but he doubts that they are adequate to address the full range of human rights concerns associated with such a strong power.

“Clause 105 of the OLSB contains other safeguards, but whether they will be adequate depends on how they are implemented in practice,” he says. “At this time, it is unclear how Ofcom will implement such protections and restrict the scope of 104 Notices.”

Clause 105(h), for instance, engages Article 10 of the ECHR by mandating that sufficient attention be given to any interference with the right to freedom of expression. However, there is no explicit clause guaranteeing sufficient protection of journalistic sources, which would need to be added to avoid a violation of Article 10.

In additional comments in response to Ryder’s opinion, the Home Office emphasized that the Clause 104 notice powers will only be used where there are no other, less intrusive measures capable of achieving the necessary reduction in illegal CSEA (and/or terrorism) content appearing on the service. The regulator will be responsible for determining whether issuing a notice is necessary and proportionate, taking into account factors outlined in the legislation, such as the need to protect the public interest.
