So Apple decided that they'll take every one of your pictures and try to assess them to check if they contain nude children.

Imagine some random guy knocking on your door, stating he's from the photo album manufacturer and now needs to check whether any of your photo albums contain pictures of naked children. And if he considers anything suspicious, he might have to report it to the police.

I'm sure you would enjoy such a visit every week. I mean, it's to protect children, right? Right?

Oops I did it again…

A little warning banner shows up, reminding you what Apple decided they should do. I guess it will stick around a bit longer than the hype around the problem.

git.shivering-isles.com/shiver

Enjoy!


Well, that didn't take long… we already have the first hash collision: social.wildeboer.net/@jwildebo

Have that picture of a dog on your phone, and the secret threshold for Apple's CSAM scanner is down by one.

Good thing there's only a 1-in-a-trillion chance of a false positive.

But no worries, it's just your own phone falsely accusing you of being a pedophile, and maybe reporting you to Apple, which might take it to the police.
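For anyone wondering what "threshold down by one" means in practice, here's a minimal sketch of how such threshold-based hash matching could work. The type name, the hash string, and the threshold value are all made up for illustration; this is not Apple's actual implementation, just the general idea of matches accumulating toward a reporting threshold.

```swift
// Minimal sketch of threshold-based perceptual-hash matching.
// Everything here (type, hash value, threshold) is hypothetical.
struct HashScannerSketch {
    let blockedHashes: Set<String>  // hypothetical blocklist of perceptual hashes
    let reportThreshold: Int        // number of matches before the account gets flagged
    var matchCount = 0

    // Scan one photo's hash; a collision (like the dog picture)
    // counts as a match even though the photo itself is harmless.
    mutating func scan(photoHash: String) -> Bool {
        if blockedHashes.contains(photoHash) {
            matchCount += 1
        }
        return matchCount >= reportThreshold
    }
}

// One colliding image and the secret counter is already one step closer to a report.
var scanner = HashScannerSketch(blockedHashes: ["colliding-dog-hash"], reportThreshold: 30)
let flagged = scanner.scan(photoHash: "colliding-dog-hash")
print("matches: \(scanner.matchCount), flagged: \(flagged)")
```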

@sheogorath Many argue that if it's a piece of software, then it's not a search without a warrant, because a search needs to be carried out by a human. But a search by software may sometimes be even worse than a search by a human, exactly because of cases like these (and if people blindly believe the results from such software, it's just a recipe for dystopia).

@sheogorath
Would it be bad to send this to a friends group on WhatsApp? What's the real risk, other than making people angry at Apple?

@jibec It depends. I'm not familiar enough with WhatsApp and iPhones to say whether this is risky. It probably depends on whether pictures are automatically uploaded to iCloud.
