Encryption technology used by Ukraine to defend against the Russian invasion could be placed at risk by measures in the Online Safety Bill currently passing through Parliament.
The CEO of a company that supplies end-to-end encryption (E2EE) technology used in Ukraine has warned that proposals in the bill to require automatic scanning of messages before they are encrypted could undermine military operations in the country.
Matthew Hodgson, CEO of encryption specialist Element, told Computer Weekly that provisions in the Online Safety Bill to identify communications related to terrorism could be subverted by state hackers to identify opponents’ military strength.
“Imagine that you are in Ukraine and you are using Element to communicate with the Ministry of Defence, and suddenly the Brits think it’s a good idea to start stockpiling every message that makes a reference to bombs. If you are Russian, you are obviously going to throw everything you can at accessing that archive of information,” he said.
Hodgson was speaking as the National Crime Agency (NCA) and partner law enforcement agencies stepped up criticism of Facebook owner Meta over its plans to extend end-to-end encryption on its messaging services.
In a statement released to coincide with the passage of the Online Safety Bill through the House of Lords, the NCA, part of the Virtual Global Taskforce of 15 law enforcement agencies, said Meta was making a “purposeful design choice” that would weaken its ability to keep children safe from abuse.
The statement said E2EE had a “devastating impact” on law enforcers’ ability to identify, pursue and prosecute offenders when implemented in a way that affects the detection of child abuse.
Client-side scanning
The Online Safety Bill will give the regulator, Ofcom, powers to require communications companies to install technology, known as client-side scanning (CSS), to analyse the content of messages for child sexual abuse and terrorism content before they are encrypted.
The Home Office maintains that client-side scanning, which uses software installed on a user’s phone or computer, is able to maintain communications privacy while policing messages for criminal content.
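In outline, the flow can be illustrated with a minimal sketch. The fingerprint list, the placeholder “encryption” and the function names below are hypothetical and do not reflect any vendor’s or the Home Office’s actual design; the point is simply where in the pipeline the content is inspected.

```python
import hashlib

# Hypothetical sketch of client-side scanning: the message is checked on the
# sender's device, in plaintext, before end-to-end encryption is applied.
# Real proposals use perceptual hashes or classifiers supplied by a third
# party; an exact SHA-256 match is used here only to keep the sketch runnable.
BLOCKLIST_FINGERPRINTS = {
    hashlib.sha256(b"known illegal sample").hexdigest(),
}


def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the message matches a blocklisted fingerprint."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST_FINGERPRINTS


def e2ee_encrypt(plaintext: bytes) -> bytes:
    """Stand-in for real end-to-end encryption (placeholder, not cryptography)."""
    return plaintext[::-1]


def send_message(plaintext: bytes) -> None:
    if client_side_scan(plaintext):
        # Flagged content is handed to a moderation service in plaintext,
        # before end-to-end encryption ever happens.
        print("flagged: reported to moderators before encryption")
        return
    print("clean: encrypted and delivered:", e2ee_encrypt(plaintext))


send_message(b"known illegal sample")   # intercepted on the device
send_message(b"ordinary private chat")  # encrypted as normal
```

Because the check runs before encryption, the end-to-end guarantee only applies to messages the scanner lets through – which is the crux of the objection from Element and other providers.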
But Hodgson told Computer Weekly that Element would have no choice but to withdraw its encrypted mobile phone communications app from the UK if the Online Safety Bill passed into law in its current form.
Element supplies encrypted communications to governments, including the UK, France, Germany, Sweden and Ukraine.
“There is no way on Earth that any of our customers would ever consider that setup [client-side scanning], so obviously we wouldn’t put that into the enterprise product,” he said.
“But it would also mean that we wouldn’t be able to supply a consumer secure messaging app in the UK. It would make a mockery of our position as a secure communications supplier,” he added.
If that were to happen, the UK would join China as the only country to have effectively banned Element’s encrypted communications service.
Other encrypted communications services, including WhatsApp and Signal, have indicated that they would no longer be able to provide encrypted messaging services in the UK if the Online Safety Bill goes ahead in its current form.
Privacy violation
Hodgson said politicians in the UK had been wrongly taken in by claims from scanning software companies that client-side scanning is a “silver bullet” that can reliably scan for abusive content without destroying privacy.
Even though CSS does not break encryption in transit, the technology means the privacy of users is “completely violated” by exposing messages to analysis by third-party moderators either before encryption or after decryption.
“It is very similar to mandating that you must have a government-supplied CCTV camera in every room of your house that will be running 24/7. It will use an unknown algorithm to detect bad things, which get reported to a private moderation team provided by the people who built your house,” he said.
“We would never accept that in real life, and just because you can technically implement that in a software environment does not mean it is the right answer,” he added.
Hacking risks
Even if the system were to operate perfectly, CSS creates new security risks that can be exploited by hackers, who could gain access to the technology, insert new rules, and potentially access a “huge honeypot” of data exfiltrated from encrypted communications and collected by the moderation team.
“If you are a child abuser and you want to gain access to child abuse content, well [with client-side scanning] you have just created a mechanism that aggregates it in one place and allows bad actors to scan through it,” he said.
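The concern can be made concrete with another deliberately simplified sketch: a client-side scanning deployment boils down to a remotely updatable rule set plus a central store of whatever those rules flag. The rule format and update mechanism below are hypothetical, but they show why control of either component matters.

```python
from typing import Callable

# Hypothetical model of a CSS reporting pipeline: a mutable rule set decides
# what gets captured, and everything flagged lands in one central store.
flagged_store: list[bytes] = []   # the aggregated plaintext Hodgson calls a "honeypot"
rules: list[Callable[[bytes], bool]] = [
    lambda msg: b"known illegal sample" in msg,
]


def push_rule_update(new_rule: Callable[[bytes], bool]) -> None:
    # Whoever can reach this update channel (the vendor, a government, or an
    # attacker who compromises it) decides what gets captured next.
    rules.append(new_rule)


def scan_and_report(plaintext: bytes) -> None:
    if any(rule(plaintext) for rule in rules):
        flagged_store.append(plaintext)   # plaintext aggregated in one place


# A silently pushed rule widens collection, echoing the Ukraine scenario above:
push_rule_update(lambda msg: b"bomb" in msg)
scan_and_report(b"logistics update: bomb disposal unit moving east")
print(len(flagged_store), "message(s) now sitting in the central store")
```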
Hodgson argues that client-side scanning of encrypted messaging is not necessary to detect terrorism and child sexual abuse, as offenders are likely to leave fingerprints of their activities on the internet.
“People who publish such material have to be discoverable, and the second that they are discoverable, they are exposing themselves by leaving a breadcrumb trail that investigators can follow,” he said.
Investigators are able to track down paedophiles through undercover work on the internet, joining communities or using artificial intelligence-controlled bots to interact with offenders. “It is that sort of approach that we use today and it works relatively effectively,” he said.
The only scenario where these techniques are not effective is where an offender acts as a “lone wolf” targeting people on the internet. But, he argued, the risk posed by lone wolves is just as real in the physical world.
“What do you do? Do you give the police blanket powers to break into people’s rooms at random if they suspect anything whatsoever? Or do you educate kids to make sure that this is bad and they should report it?” he said.
Technology companies can also use metadata from encrypted communications to identify potential offenders, including lone wolves, by identifying suspicious communications, which can be further investigated by law enforcement acting under warrant.
“A good example is if you have a user in their 50s who keeps contacting a child at four o’clock in the morning and seems to be sending images back and forth,” he said.
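A metadata-only check of the kind he describes might, in rough outline, look like the sketch below. The field names and thresholds are illustrative assumptions rather than any provider’s actual policy, and a match would only ever be grounds for further, warranted investigation.

```python
from dataclasses import dataclass


@dataclass
class ConversationMetadata:
    # Only traffic and account metadata; no message content is read.
    sender_age: int
    recipient_age: int
    contact_hours: list[int]   # hours of day at which messages were sent
    media_messages: int        # count of images/videos exchanged


def looks_suspicious(meta: ConversationMetadata) -> bool:
    """Flag adult-to-child conversations with late-night contact and heavy media exchange."""
    adult_to_minor = meta.sender_age >= 18 and meta.recipient_age < 16
    late_night = any(hour in (0, 1, 2, 3, 4, 5) for hour in meta.contact_hours)
    heavy_media = meta.media_messages > 10
    return adult_to_minor and late_night and heavy_media


# The pattern from Hodgson's example: a user in their 50s repeatedly
# contacting a child at four in the morning, with images going back and forth.
example = ConversationMetadata(sender_age=52, recipient_age=12,
                               contact_hours=[4, 4, 3, 23], media_messages=25)
print(looks_suspicious(example))   # True: a candidate for law enforcement follow-up
```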
Judicial approval
Tim Clement-Jones, a Liberal Democrat peer, has filed amendments to the Online Safety Bill in an attempt to seek more clarity on the government’s plans for monitoring messages sent using end-to-end encryption.
Section 110 of the bill gives Ofcom the ability to issue technology notices, requiring private messaging services to put in place “accredited” tech to filter the content of messages, including private messages sent by mobile phone.
Clement-Jones’s amendment would require the regulator to seek approval from a judge before issuing a technology notice, in an attempt to ensure that the privacy of the service’s users is considered and that the measures are proportionate.
A second amendment seeks to establish whether Ofcom will have to satisfy the Regulation of Investigatory Powers Act 2000, which governs surveillance, before giving a technology notice to an end-to-end encrypted messaging service.
A legal opinion commissioned by Index on Censorship from Matthew Ryder KC, published in November 2022, found that technology notices issued by Ofcom would amount to state-mandated surveillance on a mass scale.
“Ofcom will have a wider remit on mass surveillance powers of UK citizens than the UK’s spy agencies, such as GCHQ (under the Investigatory Powers Act 2016),” wrote Ryder.
The surveillance powers proposed by the Online Safety Bill were unlikely to be in accordance with the law and would be open to legal challenge, he said. “Currently, this level of state surveillance would only be possible under the Investigatory Powers Act if there is a threat to national security.”
Hodgson said that although getting legal sign-off from a judge would provide more checks and balances, it would not address the risks posed by the scanning infrastructure necessary to inspect messages on encrypted services – a “Trojan horse” that could be commandeered by hackers or hostile states to access the content of encrypted messages.
Element relies on open source software, reproducible builds and a software bill of materials to ensure its services are secure.
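Reproducible builds make that claim checkable in practice: anyone can rebuild the client from the published source and confirm the shipped binary is byte-for-byte identical. A minimal sketch of that verification step is below; the file names are hypothetical.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hash a build artifact so it can be compared against an independent rebuild."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def verify_release(shipped_artifact: Path, locally_rebuilt: Path) -> bool:
    """Return True only if the distributed binary matches the independent rebuild."""
    return sha256_of(shipped_artifact) == sha256_of(locally_rebuilt)


# A dormant scanning "blob" inserted into the shipped binary would change its
# hash and fail this comparison, which is why Hodgson argues such a mechanism
# is incompatible with a verifiably open build pipeline.
# Example (hypothetical file names):
# verify_release(Path("element-release.apk"), Path("element-rebuilt.apk"))
```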
He said that a binary blob inserted into the app and left dormant until a warrant was issued would pose “an identical threat to the threat we see if there wasn’t a warrant involved”.
Instead, he argued, the government should exempt encrypted apps from content scanning.
The bill is expected to go through 10 or 12 days of committee hearings in the House of Lords before reaching a report stage and its final third reading by July 2023.