iPhones to start scanning for images of child abuse

I’m pretty ignorant when it comes to the technology, but I’ve seen a couple of articles where the “experts” claim it is possible for people to do some damage intentionally. A guy who is a “Johns Hopkins Cryptographer” (whatever that means) was talking about the ability to, well, here is his quote:

" We’ve already noted that innocent images can sometimes match CSAM fingerprints by chance, but Green points to a paper pointing out that it’s entirely possibly to deliberately create images that will generate a matching hash.

Someone who wants to create trouble for an enemy could arrange for them to be sent innocent-looking materials – which may look absolutely nothing like a problematic photo – which match known CSAM fingerprints. That then opens up the target to all the risks just covered."
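
For the curious: a perceptual hash keeps only a coarse fingerprint of an image, which is why collisions can be manufactured. Below is a minimal Python sketch of a simple “average hash”. Apple’s NeuralHash is a neural-network-based hash, so this is only an analogy, but the attack idea is the same: because the fingerprint discards almost all of the image’s detail, an attacker can nudge an innocent-looking image until its fingerprint lands on (or near) a targeted value.

```python
# Minimal "average hash" sketch (illustrative only; not Apple's NeuralHash).
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    # Shrink to hash_size x hash_size grayscale, keeping only coarse structure.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: brighter or darker than the mean.
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1: int, h2: int) -> int:
    # Number of differing bits between two fingerprints; small distances count as a match.
    return bin(h1 ^ h2).count("1")
```

With only 64 bits of coarse brightness information to match, there is plenty of room to perturb an image that looks harmless to a human until its fingerprint collides with a targeted one, which is the scenario Green is describing.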

The following is makeup. Not a real injury.

Would this trigger the AI? If it does, what about all of the people doing makeup like this or taking pics of Halloween makeup? Or am I completely missing the point? (I never rule this out.)

Yes. Apple is matching a specific dataset of images.

Or at least that’s the part everyone’s concerned about. Because now what’s in that dataset of images is an avenue for government-enforced overreach (e.g. China et al).

I knew it. Thanks for the reply.

That isn’t necessarily what it means. They could mean photos hosted in iCloud would be “decrypted” (as Apple already holds the keys) and passed along to authorities.

This seems pretty concerning to me. I know that in the past I’ve taken pictures of my kid’s diaper rash to show my wife or the doctor, and sometimes my son will run out of the shower naked and start doing some kind of silly dance. Would I be accused of child abuse if I had recordings of that kind of stuff?

That is possible, if you send those pictures to other people, they post them publicly, and they get added to the hash database. But I mean, you’d be pretty pissed at your maiden aunt for putting your kid’s dinkle on the internet in the first place.

Oh I see, so it’s comparing information found in photos on the internet with photos from your phone? I guess I don’t need to be concerned then. That and I don’t have an iPhone.

Right, it’s computing a hash of the icky files and comparing that against hashes of the files stored on your non-existent iPhone.

My hope is that there is far more good in this. Unfortunately, I know the details of someone who recently was arrested on underage solicitation and child porn charges (and who is going to be in federal prison for a long time). He was caught in a sting operation of the type that is going on non-stop most places. And he was getting the porn from some “standard” places that such people think are safe. My guess is that they will take the specific photos from those sites and try to find people who have them on their systems. I don’t think they expect this to get everyone, but if it only gets 50%, or most of the unsophisticated, AND they are very careful not to screw up the lives of anyone for whom they may get a false positive, it will be a good thing.

That said, I do have a fundamental issue with people scanning/checking out stuff on my iPhone, even though I have nothing to hide.

I also expect whatever websites these people go to will provide info on how to hide their photos on their phones in a way that won’t be scannable. But again, if you just catch the dumber ones, more power to them.

Nobody’s defending the pedos. That would be an amazing devil’s advocate position to take, though, if anyone’s interested.

Anyway, this is about overreach and the next step. In the UK, ISPs were obligated to add web filters for kiddie porn. Fine, right? Nobody defends the pedos. But then the slope got slippery, and now they have to filter out sites selling counterfeit watches.

The issue as I’ve heard it is more “OK, now there’s a backdoor to decrypt your supposedly end-to-end encrypted messages. Tell me why Apple doesn’t give this to the FBI/CIA, etc. when someone decides you’re a threat?”

[I will note the privacy advocate was claiming that this filter was being applied to iMessages, which I’m not clear is actually happening. If I need to, I’ll dig up the BBC World Service clip that played around 9:45 AM this morning (EST).]

There are two parts to this, as I understand it.

  1. Apple matches photos locally on your phone against a hash database of kiddie porn when they’re uploaded to iCloud. If they match and you exceed a threshold, they call the cops.

  2. On kids’ accounts, Apple ML running locally on their phone looks for explicit/nude pictures being sent to your kid via the Messages app and blurs them out.

#1 is problematic because it could be extended to other illegal content, for example gay porn in Saudi Arabia. #2 is really problematic only if it talks back to the mothership.
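
To make the flow of #1 concrete, here’s a rough Python sketch of the threshold idea. This is purely illustrative: the real system uses NeuralHash plus a private set intersection scheme so Apple can’t see individual matches below the threshold, the database here is an empty stand-in, and the threshold value is just the figure of roughly 30 that Apple has mentioned publicly.

```python
# Illustrative sketch of the on-device matching plus reporting threshold in #1.
KNOWN_FINGERPRINTS: set[int] = set()  # stand-in for the database of known CSAM hashes
MATCH_THRESHOLD = 30                  # Apple has mentioned an initial threshold of around 30

def count_matches(uploaded_photo_hashes: list[int]) -> int:
    # Count uploaded photos whose fingerprint appears in the database.
    return sum(1 for h in uploaded_photo_hashes if h in KNOWN_FINGERPRINTS)

def should_flag_for_review(uploaded_photo_hashes: list[int]) -> bool:
    # Below the threshold nothing is reported; above it, the matches go to
    # human review before anyone calls the cops.
    return count_matches(uploaded_photo_hashes) >= MATCH_THRESHOLD
```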

You’re correct. According to the Wikipedia page on PhotoDNA:

It is used on Microsoft’s own services including Bing and OneDrive, as well as by Google’s Gmail, Twitter, Facebook, Adobe Systems, Reddit, Discord and the NCMEC, to whom Microsoft donated the technology.

Right, all the cloud services do it. The real problem is that Apple runs it locally on your device. They say they only do that when you upload to iCloud, but once that box is opened, it’s open. And every iPhone uploads photos to iCloud by default, too.

Yeah, they all do it. But only Apple is the one riding the marketing high horse of YOUR PRIVACY (except in China). And now this.

Yeah, once worldwide authorities know that Apple can decrypt files on an iPhone at will, every single one will demand a back door and threaten to make them illegal if Apple doesn’t allow it.

They’ll also come at it at the database level - find a way to get a non-pornographic image into the hash database that would identify certain people/groups.
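
To make that concern concrete, here’s a hypothetical sketch that reuses the toy average_hash from earlier in the thread. Nothing in the device-side check can tell a CSAM fingerprint from any other fingerprint, so whoever controls the database controls what gets flagged (in reality the entries are blinded NeuralHash values supplied to Apple; the file name below is invented).

```python
# Hypothetical sketch of the database-level worry: the device only ever sees
# opaque fingerprints, so it cannot verify what kind of image produced them.
blinded_database: set[int] = set()

# Suppose an authority slips in the fingerprint of, say, a protest flyer
# ("protest_flyer.jpg" is an invented example):
blinded_database.add(average_hash("protest_flyer.jpg"))

def device_side_match(photo_fingerprint: int) -> bool:
    # Pure set membership: the phone flags this image exactly as it would flag
    # known CSAM, with no way to tell the difference.
    return photo_fingerprint in blinded_database
```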

It’s just so easy to imagine the slippery slope here, which is why I’m so shocked that Apple is doing it.

Our government has already been doing that for years.

They’ve been demanding it, but Apple has said they actually can’t do it. So they’ve turned to third parties with varying success (more so recently, apparently).

Yes, but once the backdoor actually exists, they will lose the ability to say no.