
A proposed European law that could require communications companies, including WhatsApp, Signal, and Facebook Messenger, to scan the contents of private and encrypted messages for child abuse material is likely to be annulled by the European Court of Justice, according to the EU’s own internal legal advice.

The controversial EU law, known as ‘chat control’, would allow governments to serve “detection orders” on technology companies, requiring them to scan private emails and messages on private communication services for ‘indicators of child abuse’, in a move that critics say will undermine encrypted communications.

Technology companies have objected to similar proposals in the UK’s Online Safety Bill, warning that they would be forced to withdraw their services if regulators were given powers to require them to place “back doors” into encrypted messaging services.

The European Commission proposed in May last year to introduce mandatory requirements for all email, chat, and messaging service providers, including those providing end-to-end encrypted communications, to scan messages for illegal child sexual abuse material (CSAM).

Permanent surveillance

But leaked internal legal advice from the Council of the European Union has raised serious questions about the lawfulness of the planned ‘chat control’ measures, which it says could lead to the de facto “permanent surveillance of all interpersonal communications.”

The document, written by the Council’s legal service and seen by Computer Weekly, points out that there is a high probability that detection orders aimed at users of phone, email, messenger and chat services would constitute “general and indiscriminate” surveillance in breach of EU privacy rights.

The Council’s legal service states that the ‘chat control’ proposals imply that technology companies would either have to abandon effective end-to-end encryption, introduce some sort of “back door” to access encrypted content, or access content before it is encrypted by installing client-side scanning technology on users’ phones and computers.
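To illustrate the third option, the sketch below shows, in purely hypothetical Python, how client-side scanning is generally understood to work: content is matched against a database of known material on the user’s own device, before encryption takes place. The function names and the plain SHA-256 hash list are illustrative assumptions, not details from the proposal, which does not specify a particular technology.

    import hashlib

    # Hypothetical set of hashes of known illegal images. Real deployments are
    # usually described as using perceptual hashes rather than plain SHA-256.
    KNOWN_HASHES = {"placeholder-hash-of-known-image"}

    def client_side_scan(attachment: bytes) -> bool:
        # Runs on the user's device, before any encryption is applied.
        return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

    def send(attachment: bytes, encrypt, report_to_authority):
        if client_side_scan(attachment):
            report_to_authority(attachment)   # content disclosed pre-encryption
        return encrypt(attachment)            # encryption happens only afterwards

The point critics make is visible in the ordering: the plaintext is inspected, and potentially reported, before the end-to-end encryption step ever runs.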

“It appears that the generalized screening of content of communications to detect any kind of CSAM would require de facto prohibiting, weakening or otherwise circumventing cybersecurity measures,” the lawyers write.

There is a serious risk that the proposals would compromise citizens’ rights to privacy and data protection under articles 7 and 8 of the EU Charter of Fundamental Rights, by authorising the automated surveillance of all users of a specific messaging service, irrespective of whether they had any link with child sexual abuse, the document states.

The EU proposal requires tech companies to install “sufficiently reliable detection technologies,” but fails to explain what would count as “sufficiently reliable” or what error rates, such as messages wrongly identified as containing illegal content, would be acceptable.
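To show why the missing error-rate threshold matters at messaging scale, here is a back-of-the-envelope calculation in Python using entirely assumed figures; the message volume, false-positive rate and prevalence below are illustrative placeholders, not numbers from the proposal or the legal advice.

    # All figures below are assumptions for illustration only.
    messages_per_day = 100_000_000_000   # assumed daily volume across services
    false_positive_rate = 0.001          # assumed: 0.1% of legal messages misflagged
    prevalence = 0.00001                 # assumed share of messages that are illegal

    legal_messages = messages_per_day * (1 - prevalence)
    wrongly_flagged = legal_messages * false_positive_rate
    print(f"Legitimate messages wrongly flagged per day: {wrongly_flagged:,.0f}")
    # With these assumptions, roughly 100 million legitimate messages would be
    # flagged every day, which is why the unspecified error rates are contested.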

The legal advice, dated 26 April 2023, found that, under the case law of the European Court of Justice, member states can only lawfully carry out bulk automated analysis of the traffic and location data of communications services to combat serious threats to national security.

“If the screening of communications metadata was judged by the Court proportionate only for the purpose of safeguarding national security, it is rather unlikely that similar screening of content of communications for the purpose of combating child abuse would be found proportionate,” the legal advice warns.

EU lawyers also warn that requirements for communications companies to introduce age verification systems “would necessarily add another layer of interference with the rights and freedoms of users”.

Age verification would have to be carried out either by mass profiling of users, by biometric analysis of users’ faces or voices, or through digital identification or certification systems.

Ten EU states back surveillance of end-to-end encryption

Despite the concerns raised by the Council’s lawyers, ten EU countries – Belgium, Bulgaria, Cyprus, Hungary, Ireland, Italy, Latvia, Lithuania, Romania and Spain – argued in a joint position paper on 27 April 2023 that end-to-end encryption should not be excluded from the European Commission’s ‘chat control’ proposal.

MEP Patrick Breyer, a member of the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (Libe), called on the EU presidency, currently held by Sweden, to remove blanket monitoring of private communications and age verification from the proposed legislation.

“The EU Council’s services now confirm in crystal clear words what other legal experts, human rights defenders, law enforcement officials, abuse victims and child protection organisations have been warning about for a long time: obliging e-mail, messaging and chat providers to search all private messages for allegedly illegal material and report to the police destroys and violates the right to confidentiality of correspondence,” he said.

“What children really need and want is a safe and empowering design of chat services as well as Europe-wide standards for effective prevention measures, victim support, counselling and criminal investigations,” he added.

Concern over UK encryption plans

In an open letter published in April 2023, technology companies offering encrypted messaging services urged the UK government to make urgent changes to similar legislation going through the British Parliament.

WhatsApp, owned by Meta, said in a statement that the bill could force technology companies to break end-to-end encryption on private messaging services, affecting the privacy of billions of people.

The letter argued that end-to-end encryption offers one of the strongest possible defences against malicious actors and hostile states, as well as against persistent threats from online fraud, scams and data theft.

Separately, the National Union of Journalists warned that the Online Safety Bill risks undermining the security of confidential communications between journalists and their sources.
