Speaking of actual free speech issues, the EU is proposing a regulation that could mandate scanning of encrypted messages for CSAM. This is Apple all over again.
This document is the most terrifying thing I’ve ever seen. It is proposing a new mass surveillance system that will read private text messages, not to detect CSAM, but to detect “grooming”. Read for yourself.
Let me be clear what that means: detecting “grooming” is not simply searching for known CSAM. It isn’t even using AI to detect new CSAM, which is also on the table.
It’s running algorithms reading your actual text messages to figure out what you’re saying, at scale.
It is potentially going to do this on encrypted messages that should be private. It won’t be good, and it won’t be smart, and it will make mistakes.
But what’s terrifying is that once you open up “machines reading your text messages” for any purpose, there are no limits.
Here is the document. It is long but worth reading, because it describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration. https://alecmuffett.com/alecm/tmp/eu-csam-e2ee.pdf
By legally mandating the construction of these surveillance systems in Europe, the EU will ultimately make these capabilities available to every government.
I am so tired. It seems like every time people rally to fight off one bad surveillance proposal, an even more powerful organization pops up to devise a more invasive one. Is there no constituency out there for “just let me be, and don’t spy on my private communications”?
Monitoring private conversations is “intrusive” but this is an acceptable balance because the algorithms do not “understand” the conversations.
Am I losing my mind, or did actual thinking human beings write this text?
Dear everyone: there is no abuse-resistant technology that can accurately read real human conversations “privately”. It does not exist.
Maybe in the year 2200 we’ll be able to outsource crime prevention to a benevolent AI: today it is science fiction.
“But the children” is the number-one narrative for surveillance everywhere. Once the legal interface is there, the data will be used for anything.
The EU Commission has found a clever solution to the problem of demanding the impossible: leave it to service providers to devise a method for achieving the desired outcome, such as detecting content in #E2EE communications.
"Saving the children" or whatever is just an excuse for public consumption; they really couldn't care less about child abuse. If they did, they'd, as you point out, do more with the information they already have. This is just a pretext for mass surveillance.
This EU legislation is creepy. The good news: it is impossible to implement in practice.
1. Despite the claims, language understanding is not there yet. For low-resource languages we don't have the training data to create accurate models. (1/n)
"You're under arrest for being too nice to a teenage girl. Across multiple conversations and an extended period of time, you never insulted her or called her names ONCE"
Makes for an interesting discussion with board members. Does your encrypted data have nexus with X jurisdiction? If yes, the encryption we put in place to safeguard client and employee data is worthless.