I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
Replying to:
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
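To make the described mechanism concrete, here is a minimal toy sketch of threshold-based client-side matching. Everything in it — the hash function, the database, the threshold, the function names — is invented for illustration; Apple's actual protocol is cryptographic (private set intersection with threshold secret sharing), not a plain lookup like this.

```python
# Toy sketch of threshold-based client-side matching. All names, the hash,
# and the threshold are hypothetical, not Apple's real design.

REPORT_THRESHOLD = 3  # hypothetical: only report after this many matches

def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a real perceptual hash such as NeuralHash."""
    return sum(image_bytes) % 2**16  # placeholder, not remotely robust

def scan_library(photos: list[bytes], known_hashes: set[int]) -> str:
    """Count photos whose hash appears in the (unreviewable) database."""
    matches = sum(1 for p in photos if perceptual_hash(p) in known_hashes)
    return "report to server" if matches >= REPORT_THRESHOLD else "no report"
```

The threshold is the only privacy protection in this sketch: a single match stays on the device, and only crossing the threshold triggers a report.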
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments.
This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government.
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
But even if you believe Apple won’t allow these tools to be misused, there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review.
These hashes use a new and proprietary neural hashing algorithm Apple has developed, and gotten NCMEC to agree to use.
We don’t know much about this algorithm. What if someone can make collisions?
These images are from an investigation using a much simpler hash function than the new one Apple’s developing. They show how machine learning can be used to find such collisions.
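The collision risk is easy to demonstrate with a trivial perceptual hash. The sketch below uses an “average hash” (each bit records whether a pixel is brighter than the image mean) — vastly simpler than Apple’s neural hash, and the “images” are just invented pixel lists — but it shows how two different images can share a hash:

```python
# Toy collision demo with an average hash (aHash). Real attacks against
# neural hashes use machine learning to search for such collisions; the
# pixel values below are invented for illustration.

def average_hash(pixels: list[int]) -> tuple[bool, ...]:
    """One bit per pixel: is it brighter than the image mean?"""
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

img_a = [10, 200, 10, 200]   # high-contrast 2x2 "image"
img_b = [90, 130, 90, 130]   # visibly different, same bright/dark pattern

assert average_hash(img_a) == average_hash(img_b)  # a collision
```

Because only the bright/dark pattern relative to the mean matters, wildly different images can hash identically — and a neural hash has the same problem once someone learns to search its decision boundaries.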
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them.
Replying to:
I've been waiting for Confocal Scanning Acoustic Microscopy tools for iPhone for years!
Lol no
Replying to:
Because people will reverse engineer it to figure out how to evade CSAM scanners?
It's a problem if you don't trust the people deciding what to put into the database to search for.
Replying to:
This is interesting. This could also be used to find out who is spreading propaganda, leaks, gov/corp secrets, controversial memes. This also allows the ability to see the hierarchy of how things are shared. I could see this as a political weapon. All it takes is a court.
Any court.
Replying to:
What?
Replying to:
Tip: spell out your acronyms. Even I, as a technical person, don’t know what your acronyms are.
Child Sexual Abuse Material.
Does jailbreaking really do anything to help with this? I’m actually wondering; I have no idea how this stuff works.
The rest of the thread explains some... potential for identifying the contents of encrypted files.
Replying to:
They want the ability to destroy fact check information, CONQUER & CONTROL...
Replying to:
Speaking for myself here. Apple is pretty late to the party. Pretty much every other ESP (Electronic Service Provider) is already doing some version of this. You don’t get to store your Child Sex Abuse Material / Media on remote storage & throw a fit about privacy. Sorry.
I think you might have misinterpreted what was being said here.
Replying to:
What better methods do you suggest for scanning for CSAM?
None? How about we stop rooting through people’s files without well-reasoned court orders in the name of crime-solving?
Replying to:
From a technical POV, such a system will probably be easy to beat (false negatives) and spoof (false positives) at least for a while, maybe a long time. ANNs are very susceptible to adversarial strategies and it’s hard to imagine that fundamentally changing in current paradigms.
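As a toy illustration of the false-negative (evasion) side of this point: with the same kind of simple average hash used in earlier collision demos, a small nudge to one pixel sitting near the decision boundary flips a hash bit and breaks the match. Real adversarial attacks on neural hashes use gradient-based perturbations; the pixel values here are invented.

```python
# Toy evasion sketch: a tiny change to a pixel near the brightness mean
# flips an average-hash bit, so the image no longer matches the database.
# Invented values; only illustrates the false-negative problem.

def average_hash(pixels: list[int]) -> tuple[bool, ...]:
    """One bit per pixel: is it brighter than the image mean?"""
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

original = [100, 110, 10, 200]  # hash bits: (False, True, False, True)
evaded   = [111, 110, 10, 200]  # first pixel nudged by ~4% of the range

assert average_hash(original) != average_hash(evaded)  # match broken
```

An attacker who controls the image can search for exactly these boundary-crossing perturbations, which is why the system is weakest against the adversaries it nominally targets.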
So it’s probably ineffective against actual evil actors, because they’re worried about being caught. The rest of us, though, with our good old-fashioned surveillable data without adversarial filters, will be the losers as usual.
Replying to:
How do I opt out, ? Ever so slowly moving towards having a Linux phone running or something similar. Need my on bare metal...
Replying to:
This deserves much amplification. I suspect 90% of iPhone users don't understand the gravity of this (the 10% probably on the tweets)
I think people have decided everything is already being seen and have somewhat decided not to look further into how specifically that is the case. The iPhone is majorly convenient.
Replying to:
Possibly dumb question... But don't they already have this (or something similar capability wise) for cloud sync? Hash the file, check to see if it's indexed and if not sync it to iCloud. They also would probably use something similar for iTunes for song matching etc?
It's similar but the design goals of a CSAM detector are: 1-way; hashes only identify CSAM; indifferent to filters, crops, rotations, or byte changes; and ideally the system can't be used against itself, e.g., like malware crypters bypass AV.
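A small sketch of that difference: a cryptographic hash (as used for dedup/sync) changes completely when a single byte changes, so exact matching misses a tweaked copy, while even a trivial perceptual hash survives the tweak. The pixel lists are invented examples.

```python
# Cryptographic vs. perceptual hashing on a one-byte change.
# Pixel values are invented; the perceptual hash is a toy average hash.
import hashlib

def average_hash(pixels: list[int]) -> tuple[bool, ...]:
    """One bit per pixel: is it brighter than the image mean?"""
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

img     = [10, 200, 10, 200]
tweaked = [11, 200, 10, 200]  # one pixel nudged by 1 (a "byte change")

# Exact-match dedup (cryptographic hash) misses the tweaked copy:
assert hashlib.sha256(bytes(img)).digest() != hashlib.sha256(bytes(tweaked)).digest()

# The perceptual hash is unchanged, so matching survives the tweak:
assert average_hash(img) == average_hash(tweaked)
```

That robustness to crops, filters, and byte changes is exactly what makes a CSAM detector different from the hashing iCloud already does for sync.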
Replying to:
To be fair, 100% of what Apple releases is a bad idea... The last good idea was the iPhone 1... and the original MacBook Pro in 2005... since then all garbage all the time, no innovation, just the same warm dog ***t...