“War upon end-to-end encryption”: EU wants Big Tech to scan private messages

Services may have to scan encrypted messages for child abuse images and grooming.

[Illustration of an eye on a digital background. Credit: Getty Images | Yuichiro Chino]

A European Commission proposal could force tech companies to scan private messages for child sexual abuse material (CSAM) and evidence of grooming, even when those messages are supposed to be protected by end-to-end encryption.

Online services that receive "detection orders" under the pending European Union legislation would have "obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges," the proposal says. The plan calls end-to-end encryption an important security tool but essentially orders companies to break that end-to-end encryption by whatever technological means necessary:

In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation.

That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.

A questions-and-answers document describing the plan emphasizes the importance of scanning end-to-end encrypted messages. "NCMEC [National Center for Missing and Exploited Children] estimates that more than half of its CyberTipline reports will vanish with end-to-end encryption, leaving abuse undetected, unless providers take measures to protect children and their privacy also on end-to-end encrypted services," it says.

“Do the impossible, you get to decide how”

"It really looks like the European Commission wants to cancel encryption," said a post by Bits of Freedom, a Dutch digital rights foundation. The proposal "will force companies to monitor what people share with each other via chat apps like WhatsApp and platforms like Instagram," Bits of Freedom policy adviser Rejo Zenger wrote. "If deemed necessary, platforms will be forced to delete information or report it to the authorities. Internet service providers can also be ordered to monitor their customers' Internet traffic. But the Commission omits, quite cleverly, depending on where you're standing, just how they should do so. Effectively [the] message for companies is: 'Do the impossible, you get to decide how.'"

An EC announcement said the problem of CSAM has gotten out of hand and that the current "voluntary" system isn't enough. "With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive," the announcement said. "The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64 percent increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children."

The proposal's detection orders would be "issued by courts or independent national authorities," the announcement said. A detection order would be "limited in time, targeting a specific type of content on a specific service," and instruct the company receiving the order to scan "for known or new child sexual abuse material or grooming." Grooming means "solicitation of children," the announcement said.

Other parts of the proposal "require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation of children." Additionally, "providers that have detected online child sexual abuse will have to report it to the EU Centre," and "national authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions."

“War upon end-to-end encryption”

Scanning the content of private messages shouldn't be possible with encryption that is truly end to end. As Proton Mail explains, "E2EE [end-to-end encryption] eliminates this possibility because the service provider does not actually possess the decryption key. Because of this, E2EE is much stronger than standard encryption."
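To see why that matters, here is a minimal sketch of end-to-end encryption using the PyNaCl library. The names and the "server_sees" variable are illustrative assumptions, not any real messaging service's implementation; the point is simply that the provider relays ciphertext it holds no key for.

```python
# Minimal E2EE sketch using PyNaCl (pip install pynacl). Names and the
# "server_sees" variable are illustrative, not a real service's design.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The provider only ever relays the ciphertext. Without a private key it has
# nothing to decrypt with -- exactly the property a detection order would
# have to work around.
server_sees = bytes(ciphertext)

# Bob decrypts locally with his private key.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(server_sees) == b"meet at noon"
```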

The European proposal was criticized by security experts including Alec Muffett, a network security researcher who—among other things—led the team that added end-to-end encryption to Facebook Messenger. "In case you missed it, today is the day that the European Union declares war upon end-to-end encryption, and demands access to every person's private messages on any platform in the name of protecting children," Muffett wrote.

In 2018, Facebook explained "that end-to-end encryption is used in all WhatsApp conversations and can be opted into in Messenger. End-to-end encrypted messages are secured with a lock, and only the sender and recipient have the special key needed to unlock and read them. For added protection, every message you send has its own unique lock and key. No one can intercept the communications."
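The "unique lock and key" for every message corresponds to deriving a fresh symmetric key per message from an advancing key chain. The toy sketch below, using Python's cryptography package, shows only that idea; it is not the actual Signal protocol that WhatsApp and Messenger's encrypted modes are built on, and the initial chain key here is an assumption standing in for a real key exchange.

```python
# Toy illustration of per-message keys: each message key is derived from an
# advancing chain key, so every message is locked with its own key. A sketch
# in the spirit of a ratchet, not the actual Signal protocol.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

chain_key = os.urandom(32)  # assumption: output of a prior key exchange

def next_message_key(chain: bytes) -> tuple[bytes, bytes]:
    """Advance the chain and return (new_chain_key, one_time_message_key)."""
    out = HKDF(algorithm=hashes.SHA256(), length=64, salt=None,
               info=b"per-message key").derive(chain)
    return out[:32], out[32:]

for plaintext in (b"first message", b"second message"):
    chain_key, message_key = next_message_key(chain_key)
    nonce = os.urandom(12)
    # AES-GCM with a fresh 256-bit key per message; compromising one message
    # key reveals nothing about the others.
    ciphertext = AESGCM(message_key).encrypt(nonce, plaintext, None)
    print(ciphertext.hex())
```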


“This is Apple all over again”

The EU proposal is similar to previous calls for encryption "backdoors" made by US government officials. While Apple CEO Tim Cook opposed government-mandated backdoors, his company drew a backlash from security experts last year when it unveiled a plan to have iPhones and other devices scan user photos for child sexual-abuse images. Apple put the plan on hold and promised to make changes to address criticisms.

"This is Apple all over again," Johns Hopkins University cryptography professor Matthew Green wrote of the new EU proposal. In a tweet, Green called the plan "the most terrifying thing I've ever seen. It is proposing a new mass surveillance system that will read private text messages, not to detect CSAM, but to detect 'grooming.'"

Green was referring to part of the proposal that said, "detecting 'grooming' would have a positive impact on the fundamental rights of potential victims especially by contributing to the prevention of abuse," and that "indicators of 'grooming' are becoming ever more reliable with time, as the algorithms learn."

"This is going to the EU parliament shortly and will fundamentally change every aspect of the balance between citizen privacy and law enforcement. We seem to be sleepwalking into it," Green wrote.

The “least privacy-intrusive” way

The EC's announcement said that companies will be instructed to implement the proposed detection orders in the "least privacy-intrusive" way. "Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre," it said. "Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible."

However, Bits of Freedom wrote that it is "simply impossible to filter someone's Internet connection the way the European Commission wants." The group explained further with an example involving WhatsApp:

To give an example: based on this proposal, an instant messaging platform can be given the task of detecting material depicting the sexual exploitation of children. That could be known material, "new" material, or grooming, which is text. Let's assume, for the sake of argument, that the order is given to Meta with regard to WhatsApp, a platform that, as you know, is protected with end-to-end encryption. This type of encryption means Meta can see who is communicating with whom but is unable to read the content of that communication. So how is Meta supposed to detect something in a conversation it's not supposed to be able to access? For the sake of convenience, the Commission leaves that decision (the "how to do it") to the platform. Our guess is that the only way to do it is by installing some sort of (now government-mandated!) spyware on the phones of the people using a particular service. After all, that is the only place where the content of the chats is readable.
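In the simplest case, such on-device scanning would hash content before it is encrypted and compare it against an indicator list pushed to the phone. The sketch below shows only that exact-hash baseline; the list and function names are hypothetical, and real proposals contemplate perceptual hashes and machine-learning classifiers for grooming, which are far less exact than this.

```python
# A hedged sketch of the simplest form of client-side scanning: hashing
# content on the device *before* it is end-to-end encrypted and checking it
# against a provider-supplied indicator list. The list and names here are
# hypothetical placeholders.
import hashlib

# Stand-in for indicators "verified and provided by the EU Centre".
KNOWN_INDICATORS = {
    # SHA-256 of the string "test", used purely as a placeholder value.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_indicator(attachment: bytes) -> bool:
    """Runs on the sender's phone, where the plaintext is still readable."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_INDICATORS

# The message is scanned first and encrypted afterward, so the ciphertext the
# provider relays stays unreadable even though the content was inspected.
print(matches_indicator(b"holiday photo bytes"))  # False: send normally
```

Because the check happens before encryption, critics argue it undermines the guarantee of end-to-end encryption even though the encryption itself is never mathematically broken.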

The news site Euractiv quoted Moritz Körner, a German member of the European Parliament, as saying the plan is "nothing short of a 'Stasi 2.0.'"

"Instead of fighting these heinous crimes by disproportionately giving up the basic rights of all EU citizens, it would be better to invest significantly more in the equipment of the police, the European police authority Europol and in the cross-border cooperation of the relevant authorities," Körner said.
