So Apple decided that they'll take every one of your pictures and try to assess them to check whether they contain nude children.

Imagine some random guy knocking on your door, stating he's from the photo album manufacturer and now needs to check whether any of your photo albums contain pictures of naked children. And if he considers anything suspicious, he might need to report it to the police.

I'm sure you would enjoy such a visit every week, I mean, it's to protect children, right? Right?

Oops I did it again…

A little warning banner that shows up, reminding you what Apple decided to do. I guess it'll stick around a bit longer than the hype around the problem.

git.shivering-isles.com/shiver

Enjoy!

Well, that didn't take long… we have the first hash collision: social.wildeboer.net/@jwildebo

Have that picture of a dog on your phone and the secret threshold for Apple's CSAM scanner is down by one.

Good thing there's only a 1-in-a-trillion chance of a false positive.

But no worries, it's just your own phone falsely accusing you of being a pedophile, maybe reporting you to Apple, which might take it to the police.
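For a sense of why a single colliding picture matters, here's a toy sketch of threshold-based matching. The hash function, database contents, and threshold value below are all made up for illustration; Apple's actual system uses NeuralHash together with private set intersection and threshold secret sharing, and the real threshold is not public.

```python
# Toy illustration of threshold-based perceptual-hash matching.
# KNOWN_HASHES and REPORT_THRESHOLD are hypothetical stand-ins;
# the real database and threshold are secret.

KNOWN_HASHES = {0xDEADBEEF12345678}  # hypothetical flagged-image hashes
REPORT_THRESHOLD = 30                # made-up number; the real one is secret

def average_hash(pixels: list[list[int]]) -> int:
    """Toy 64-bit average hash over an 8x8 grayscale image (values 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        # Each pixel contributes one bit: 1 if brighter than average.
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def scan_library(images: list[list[list[int]]]) -> bool:
    """Return True once enough images match the database to trigger a report."""
    matches = 0
    for image in images:
        if average_hash(image) in KNOWN_HASHES:
            # A single colliding picture moves you one step closer
            # to the threshold, whatever its actual content is.
            matches += 1
    return matches >= REPORT_THRESHOLD
```

The point of the dog picture above is exactly this: a perfectly benign image whose hash collides with a database entry counts as a real match, so each collision eats into the threshold meant to keep false positives rare.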


@sheogorath
would it be bad to send this to a friends group on WhatsApp? What's the real risk, other than making people angry at Apple?

@jibec It depends; I'm not familiar enough with WhatsApp and iPhones to say whether this is risky. It probably depends on whether pictures are automatically uploaded to iCloud.
