iPhones to start scanning for images of child abuse

Look, child abuse is abhorrent and anyone engaging in it should have terrible things done to them, but I’m not super comfortable with an algorithm automatically searching the content of my iPhone for anything. This is perhaps one step too far over the Big Brother line for me.

So how do we opt out?

‘continuously scan’

Sounds like a great way to slow down phones.

I’m sure it will be perfect and nothing wrong will happen.

This is a real problem, because once Pandora’s box is open, it will be used for more than kiddie diddlers and al-Shabaab. You’ll find China looking for images of Xi as Winnie the Pooh, Russia watching the Chechens, etc. Maybe President Eric Trump will be checking out what Washington Post reporters have on their phones.

Is it even legal?

Yes, of course. You sign a TOS.

Yeah, because it’s a private company doing the scanning and not the government, but once the tech is there, you know police departments will be submitting thousands and thousands of images they’d like Apple to scan for as well, including pictures of drugs and/or drug paraphernalia, stolen property, etc.

I’m really shocked to see this, to be honest, given Apple’s history with security. I’m hoping the rumor is vastly misconstruing what they have planned.

They’re now saying it’s only when you upload to iCloud, but iPhones do that by default, so it isn’t great. That said, you can turn it off, so if you’re gay in Saudi Arabia or a dissident in Hungary or whatever, there’s a recourse.

Yeah, this update doesn’t make me feel much better about this. The slippery slope is nigh.

Update 8/5 4pm ET: Apple confirmed plans to start testing a system that would be able to detect images of child sexual abuse stored in iCloud Photos in the United States. “Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations,” the company wrote in a statement. “Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

The update will be rolling out at a later date, along with several other child safety features, including new parental controls that can detect explicit photos in children’s Messages.

That sounds like they are looking for a set of known porn images, not analyzing the content of photos to “recognize” what they depict.

Correct, it just hashes the files.
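For a sense of what “just hashes the files” means in practice, here’s a minimal sketch, assuming a plain SHA-256 over the file bytes. (Apple describes a perceptual “NeuralHash” plus private set intersection, so the real thing is fancier: resized or recompressed copies still match, and the device never learns which hashes hit.)

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad image hashes. In Apple's description these come
# from NCMEC and are stored on the device in an unreadable (blinded) form.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(path: Path) -> str:
    # Hash the raw bytes; nothing ever "looks at" what the picture depicts.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    return hash_image(path) in KNOWN_HASHES
```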

Detecting in kids’ iMessages sounds like it looks at content though.

What is not discussed is what happens when it detects a match. Are they going to alert authorities? Seems like a risk of false positives might mean you get arrested and forced to unlock your phone so the cops can comb through it.

All we have to do to see that get abused is wait.

I don’t think that is one of the problems. As far as I recall how it works, with a long enough hash it’s highly unlikely that two different images collide. Use multiple hashes and it’s not going to happen before the heat death of the universe. Or a few universes.
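For a sense of scale, a rough back-of-envelope, assuming an ideal 256-bit hash and purely random collisions (a simplification; a perceptual hash is deliberately fuzzier):

```python
# Odds that any of N photos randomly collides with any of M database hashes,
# assuming an ideal 256-bit hash. Both N and M are made-up illustrative numbers.
N = 100_000                      # photos in a large library
M = 1_000_000                    # known-bad hashes in the database
p_collision = N * M / 2**256
print(f"{p_collision:.1e}")      # ~8.6e-67 -- effectively never
```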

They do discuss it: if you pass a certain threshold of matches, which they say is tuned to a false positive rate of one in a trillion, a human logs into your iCloud account, looks at the pictures, and then refers the case to law enforcement.
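A hedged sketch of that flow (the threshold number below is a placeholder, not Apple’s published value, and the real system wraps matches in encrypted “safety vouchers” rather than a simple counter):

```python
def count_matches(photo_hashes: list[str], known_hashes: set[str]) -> int:
    # Count how many of an account's uploaded photos match the database.
    return sum(1 for h in photo_hashes if h in known_hashes)

MATCH_THRESHOLD = 30  # placeholder value; not Apple's actual number

def should_escalate(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    # A single match does nothing; only crossing the threshold triggers
    # human review and, potentially, a referral to law enforcement.
    return match_count >= threshold
```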

It saves the hashes to our device? So every iOS update is going to come with this bloated database of bad images?

Apple charges $100 per 32 GB of storage on your phone. It’s absurd.

I’m just not a fan of the whole “if you’re not a crook you don’t have anything to worry about” mentality. There is way too much potential for unintended consequences. As soon as the world’s authorities realize that they can now search everyone’s images as long as they get a court order, shit’s going to get real.

I was under the impression, maybe wrongly, that everyone else is already doing this, Apple is just joining the party. Not true?

It’s just the hashes, the space usage is the least objectionable part of the whole thing.
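Back-of-envelope on the size, assuming 32-byte (256-bit) hashes and a database in the low millions (both guesses):

```python
# Hypothetical on-device database size: a few million 32-byte hashes is
# tens of megabytes, not a storage-busting blob.
num_hashes = 2_000_000           # assumed database size
bytes_per_hash = 32              # 256-bit hash
print(f"~{num_hashes * bytes_per_hash / 1e6:.0f} MB")   # ~64 MB
```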

Something about this doesn’t add up to me. If they are only searching images on iCloud, why do the hashes need to exist on your phone?

They actually compare the hashes on your phone, but only when uploading to iCloud. Or so they say.
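Which would make the flow look roughly like this sketch (names and structure are illustrative, not Apple’s actual code): the comparison runs on the device, but only as part of preparing an iCloud Photos upload.

```python
from pathlib import Path

def matches_known_database(photo: Path) -> bool:
    # Stand-in for the on-device hash comparison sketched earlier.
    return False

def send_to_icloud(photo: Path, flagged: bool) -> None:
    # Stand-in for the actual upload.
    print(f"uploading {photo.name}, flagged={flagged}")

def maybe_upload(photo: Path, icloud_photos_enabled: bool) -> None:
    # If iCloud Photos is off, nothing is uploaded and no comparison runs.
    if not icloud_photos_enabled:
        return
    # The comparison happens on-device while preparing the upload; in Apple's
    # description the result travels with the photo as an encrypted voucher.
    send_to_icloud(photo, matches_known_database(photo))
```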