
So Apple decided that they'll take every one of your pictures and try to assess them to check if they contain nude children.

Imagine some random guy knocking on your door, stating that he's from the photo album manufacturer and now needs to check whether any of your photo albums contain pictures of naked children. And if he considers anything suspicious, he might need to report it to the police.

I'm sure you would enjoy such a visit every week, I mean, it's to protect children, right? Right?

Oops I did it again…

A little warning banner showing up, reminding you of what Apple decided to do. I guess that will stick around a bit longer than the hype around the problem.

git.shivering-isles.com/shiver

Enjoy!

Well, that didn't take long… we have a first hash collision: social.wildeboer.net/@jwildebo

Have that picture of a dog on your phone and the secret threshold for Apple's CSAM scanner is down by one.

Good thing there is only a 1 in a trillion chance of a false positive.

But no worries, it's just your own phone wrongly accusing you of being a pedophile and maybe reporting you to Apple, which might take it to the police.
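To see why collisions like the dog picture above are possible at all: perceptual hashes are deliberately fuzzy, so visually different images can end up with the same hash. This is a minimal sketch using a toy "average hash" (aHash), which is NOT Apple's NeuralHash or anything they ship; the two hand-made 4x4 "images" are invented purely to demonstrate a collision.

```python
# Toy average hash (aHash): a pixel becomes 1 if it is brighter than the
# image's mean brightness, 0 otherwise. Two images with the same bright/dark
# layout collide even though their actual pixel values differ a lot.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# Two clearly different 4x4 "images" (made-up values) with the same
# bright-corner structure...
img_a = [[200, 200, 10, 10],
         [200, 200, 10, 10],
         [ 10,  10, 10, 10],
         [ 10,  10, 10, 10]]

img_b = [[255, 255, 90, 90],
         [255, 255, 90, 90],
         [ 90,  90, 90, 90],
         [ 90,  90, 90, 90]]

print(average_hash(img_a) == average_hash(img_b))  # True: a hash collision
```

Real systems use far more robust hashes than this, but the underlying trade-off is the same: the fuzziness that makes the hash survive resizing and re-encoding is exactly what makes collisions possible.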

@sheogorath I'm so beyond caring at this point that if they said that they were reading all my emails and replying for me, I would probably just shrug.

Yeah, it's not great. Yeah, it seems like Apple is under pressure to cooperate with law enforcement, or at least make some sort of overture that makes it look like they're doing something.

Listen: I don't care anymore. They're going to do this with or without my consent, no matter how many soccer moms yell at their overworked interns.

@sheogorath Not everyone can be an island of self-hosted nirvana. Being a sysadmin is hard, stupid and overkill. Depending on others is not a sin, nor a failing. Every day we get new news about some company doing some stupid bullshit.

Are we expected to jump ship? Move our hundreds of gigabytes of data to another datacenter down the hall?

I mean, what do we do? Is it worth it? Why bother?

@nathand the origin of this problem is not technical. Neither is the solution.

I try to explain what Apple does here in simple words and raise awareness about it in society. We can either try to solve this in a technical manner and be the 5 people who no one wants to talk to but who are safe, or we can address it as a social matter and organize protests against it. Maybe with the EFF, maybe with your local party. This is how we might stop it. Or fail trying.

@sheogorath that mischaracterizes the tech in important ways. They are checking whether your images match known bad images; random naked pics of your own children getting flagged would be a false positive.

@LovesTha which is technically a fun thing, but becomes quite unfunny when it happens.

Depending on how things go, you either have Apple staff going through your pictures or, even worse, cops showing up asking you and maybe your neighbours strange questions.

@LovesTha

They use MD5 hashes of actual child abuse images, which is a 100% match, or PhotoDNA, which is a fuzzy but quite precise match, I guess. I would be much more concerned about "demands creep": you start with child abuse and end up reporting pirated movies or apps simply because the feature is already there.
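The two matching styles mentioned above behave very differently. A sketch of that difference, assuming a cryptographic hash (MD5) for exact matching and a Hamming-distance threshold as a stand-in for fuzzy matching; the byte strings, bit strings, and threshold below are made-up illustrations, not values from any real CSAM database:

```python
import hashlib

def exact_match(data: bytes, known_md5: str) -> bool:
    # Cryptographic hashing: any change to the input, however tiny,
    # produces a completely different hash, so only exact copies match.
    return hashlib.md5(data).hexdigest() == known_md5

def fuzzy_match(hash_a: str, hash_b: str, threshold: int = 2) -> bool:
    # Perceptual-style matching: count differing bit positions (Hamming
    # distance) and flag anything at or below the threshold.
    distance = sum(a != b for a, b in zip(hash_a, hash_b))
    return distance <= threshold

# A single changed byte defeats the exact match...
original = b"some image bytes"
tweaked = b"some image bytez"
print(exact_match(tweaked, hashlib.md5(original).hexdigest()))  # False

# ...while the fuzzy match still fires on a near-identical hash.
print(fuzzy_match("1100110000000000", "1100110000000001"))  # True
```

This is also why "demands creep" worries people: once fuzzy matching infrastructure exists, extending it is just a matter of adding hashes to the list.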

@sheogorath
Niiice!

Btw we got a Cloudflare warning when we boosted your site. Could be DNS-related.

@dsfgs Yes, Cloudflare is my DNS provider of choice at the moment.

And yes, I'm fully aware of all the (potential) problems Cloudflare causes.

@sheogorath
It's a very nice little website and message, so we shared it.

We do recognise DNS is not quite as bad as the whole site being siphoned off through it.

@sheogorath Many argue that if it's a piece of software then it is not search without warrant, because a search needs to be carried out by a human. But search by software may be (sometimes) even worse than search by human, exactly because of these cases (and if people blindly believe in results from such software, then it's just recipe for dystopia).

@sheogorath
would it be bad to send this to a friends group on WhatsApp? What's the real risk, other than making people angry at Apple?

@jibec it depends. I'm not familiar enough with WhatsApp and iPhones to say whether this is risky. It probably depends on whether pictures are automatically uploaded to iCloud.

Sheogorath's Microblog

This is my personal microblog. It's filled with my fun, joy and silliness.