Apple to scan customers’ devices for inappropriate pictures.

Apple has announced that it will soon scan customers’ devices for photos depicting child abuse.

This is starting to alarm people, especially those concerned about unlawful search and seizure. Constitutional protections limit government abuse, not private companies, but does the concern still apply when Apple says its algorithms will scan photos on devices, send questionable content for human review, and then refer it to law enforcement if deemed necessary?

As part of its Expanded Protections for Children, Apple plans to scan images on iPhones and other devices before they are uploaded to iCloud. If the hash of an image matches one in the database of known material maintained by the National Center for Missing and Exploited Children (NCMEC), a human at Apple will review the image to confirm whether it contains child pornography. If it’s confirmed, NCMEC will be notified and the user’s account will be disabled.
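To make the mechanics concrete, here is a minimal sketch of the general idea of client-side hash matching, written in Swift since this runs on Apple devices. It is an illustration under stated assumptions, not Apple’s implementation: the real system uses a perceptual hash called NeuralHash plus cryptographic protocols such as private set intersection, while this sketch uses SHA-256 as a plain stand-in, and the names `scanBeforeUpload`, `knownHashes`, and the threshold value are all hypothetical.

```swift
import Foundation
import CryptoKit

// Minimal sketch of client-side hash matching before upload. Illustration
// only: Apple's described system uses NeuralHash (a perceptual hash) and
// private set intersection, not SHA-256, and every name here is hypothetical.

struct ScanResult {
    let matchCount: Int
    let flaggedForReview: Bool
}

func scanBeforeUpload(photos: [Data],
                      knownHashes: Set<String>,
                      threshold: Int) -> ScanResult {
    var matchCount = 0
    for photo in photos {
        // Stand-in for a perceptual hash of the image bytes.
        let digest = SHA256.hash(data: photo)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        if knownHashes.contains(hex) {
            matchCount += 1
        }
    }
    // In the design Apple described, nothing is surfaced for human review
    // until the number of matches crosses a threshold.
    return ScanResult(matchCount: matchCount,
                      flaggedForReview: matchCount >= threshold)
}

// Hypothetical usage with placeholder data:
let knownHashes: Set<String> = []   // hash list shipped to the device
let queuedPhotos: [Data] = []       // photos awaiting iCloud upload
let result = scanBeforeUpload(photos: queuedPhotos,
                              knownHashes: knownHashes,
                              threshold: 30)
print("matches: \(result.matchCount), flagged: \(result.flaggedForReview)")
```

One detail worth noting: an exact hash like SHA-256 only matches byte-identical files, which is why the real system uses a perceptual hash that tolerates resizing and re-encoding. That same fuzziness is part of what worries critics, since it opens the door to false matches.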

The announcement raised concerns among privacy advocates who questioned how Apple could prevent the system from being exploited by bad actors. The Electronic Frontier Foundation said in a statement, referring also to a companion feature that scans images sent through Messages, that “it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children” and that the system, however well-intended, “will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

But this does raise questions. Apple is, in effect, creating a backdoor for this process to work. How secure will that backdoor be? Will it give bad actors a way to hack into your services and your private information? Will intelligence agencies around the world use it to exploit and gain access to accounts?

Now, most people are not “important” enough to be targets of government agencies, but consider the countries around the world with terrible records on human rights. Can you imagine these bad actors licking their chops at ways to infiltrate devices, gather information on people they do not like, and use it against them? Oh wait. Apple has probably already caved to the Chinese Communist Party and allowed information to be passed along to it. Maybe we will find out this is the newest of the changes the Party demands for Apple to continue operating in China.

How do we know that Apple will not use this for other purposes? What if it were decided that customers’ devices needed to be scanned for evidence of certain political views? What if the scanning expands to cover any kind of photograph? How many couples send NSFW photos to each other and expect such items to remain private between them?

Fighting to protect minors from harm is a good thing, and I am all for it. But for a company that proclaims how important its customers’ privacy is and how it intends to defend it, this stops you, gives you pause, and makes you wonder.

Why put these scanners on the device at all? I suspect Apple wants to catch anyone not using iCloud backup. Otherwise, I could see server-side code scanning photos as they are uploaded to iCloud; a sketch of that alternative follows.
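As a hypothetical sketch of that server-side alternative, which is reportedly closer to how other cloud providers scan (for example with Microsoft’s PhotoDNA), the handler below reuses the SHA-256 stand-in from the earlier example; every name here is invented for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical server-side alternative: the device runs no scanning code;
// matching happens on the server after upload. All names are invented.

func handleUpload(photo: Data,
                  knownHashes: Set<String>,
                  reviewQueue: inout [Data]) {
    // Same stand-in hash as the client-side sketch.
    let hex = SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
    if knownHashes.contains(hex) {
        // Flag for human review; the user's device is never involved
        // in the scanning itself.
        reviewQueue.append(photo)
    }
}
```

The trade-off is the one raised above: a server-side design only ever sees what you choose to upload, while the on-device design puts the scanning machinery on hardware you own, and that is precisely what has privacy advocates worried.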

In a world in which all sense of privacy has been given away, I can’t help but feel we are heading in the direction of China and its surveillance state.

Apple needs to be transparent about how it is doing this and prove to its users that it is doing only what it says it is doing and nothing else. Otherwise, Apple is no different from Google harvesting your data for advertising, and there is really no reason to stay exclusively with Apple.

But hey! What do I know? The present generation will say I am worrying about nothing. Since they have no sense of history and what’s taken place in the past, I won’t worry about what they think.
