Apple is about to announce a new feature.
They are going to start scanning everyone’s iPhone for banned content. Seriously.
It uses neural networks and machine learning, so I am sure it will be cool.
According to respected cryptography professor Matthew Green, it is going to scan everyone’s devices for child pornography, now referred to as Child Sexual Abuse Material (CSAM).
Historically, the industry hashed known CSAM and looked for exact matches. But if someone changes a single pixel, the hash no longer matches, hence the use of machine learning.
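The exact-match problem can be sketched in a few lines. This is a toy illustration, not Apple’s actual system or any real perceptual-hashing scheme: it contrasts a cryptographic hash, which changes completely when a single pixel changes, with a simple “average hash” fingerprint, which tends to survive tiny edits. The image data here is made up.

```python
import hashlib

# Hypothetical 8x8 grayscale "image" as a flat list of pixel values (0-255).
image = [16 * (i % 8) + 2 * (i // 8) for i in range(64)]

# One-pixel tweak: bump a single pixel's brightness by one.
tweaked = list(image)
tweaked[10] += 1

def crypto_hash(pixels):
    # Exact-match hashing: any change at all flips the digest entirely.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel is
    # brighter than the image's mean. Small edits rarely change it.
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

print(crypto_hash(image) == crypto_hash(tweaked))    # False
print(average_hash(image) == average_hash(tweaked))  # True for this tiny edit
```

That gap between exact matching and fuzzy matching is what machine-learning approaches are meant to close, at the cost of being far harder to audit.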
Apple apparently already scans users’ iCloud backups, since Apple refused to encrypt them. They did that at the request of police, who want to be able to search your backups easily.
This is just the next step, right?
And CSAM is bad (fair enough, it is).
I am sure that Russia or China. Or the United States. Will never ask Apple to search a phone for ANYTHING else.
And guess who the guinea pigs are?
United States users. Probably because the U.S. has no national privacy law and no national privacy rights. So Apple doesn’t have to worry about breaking any pesky laws.
Welcome to 1984. Only a little bit late.
Pre-crime comes next year, no doubt.
I am glad I am an Android user.
Credit: The Register