So Apple decided that they'll take every one of your pictures and try to assess them to check whether they contain nude children.

Imagine some random guy knocking on your door, stating that he's from the photo album manufacturer and now needs to check whether any of your photo albums contain pictures of naked children. And if he considers anything suspicious, he might need to report it to the police.

I'm sure you would enjoy such a visit every week. I mean, it's to protect children, right? Right?

@sheogorath That mischaracterizes the tech in important ways. They are checking whether your images match known bad images; random naked pics of your own children getting flagged would be a false positive.

@LovesTha

They use MD5 hashes of actual child abuse images, which is an exact match, or PhotoDNA, which is a fuzzy match but quite precise, I guess. I would be much more concerned about "demands creep": you start with child abuse and end up reporting pirated movies or apps simply because the feature is already there.
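To make the distinction concrete, here's a minimal sketch of the two matching styles: exact MD5 matching on file bytes versus a toy "average hash" compared by Hamming distance. The average hash is only an illustrative stand-in for PhotoDNA, whose actual algorithm is proprietary; the function names and threshold are made up for this example.

```python
import hashlib

# Exact matching: MD5 of the raw file bytes. Any change to the file
# (recompression, resizing, one flipped pixel) yields a different hash.
def md5_match(image_bytes: bytes, known_hashes: set) -> bool:
    return hashlib.md5(image_bytes).hexdigest() in known_hashes

# Fuzzy matching: a toy average hash over an 8x8 grayscale grid.
# Each bit records whether a cell is brighter than the image mean,
# so small edits only flip a few bits instead of the whole hash.
def average_hash(pixels: list) -> int:
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits between the two hashes.
    return bin(a ^ b).count("1")

# Two images count as "the same" if their hashes differ in few bits.
def fuzzy_match(h1: int, h2: int, threshold: int = 5) -> bool:
    return hamming(h1, h2) <= threshold
```

The asymmetry is the point of the thread: the MD5 path can only flag byte-identical copies of already-known images, while the fuzzy path tolerates re-encoding, which is what makes it both useful and a false-positive risk.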

@sheogorath

Mastodon 🔐 privacytools.io