Apple yesterday announced new child-safety features coming to iOS 15, iPadOS 15, and macOS Monterey. The new features let Apple scan photos in your iCloud Photos library and in Messages for sexually explicit content and potential child abuse material. Despite Apple's claims that all of the photo scanning happens on the device, the feature has raised privacy concerns among security researchers worldwide.

But what is iCloud photo scanning? In short, Apple will now be able to scan photos on your iPhone, iPad, and Mac before they're uploaded to iCloud. Before the upload begins, the operating system will check the photos for potential Child Sexual Abuse Material (CSAM). Apple says the scanning happens on the device, using a new technique called NeuralHash that compares each photo's hash against a database of known CSAM hashes. If a match is found, Apple 'may' report the case to the National Center for Missing and Exploited Children (NCMEC).
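To make that flow a little more concrete, here is a minimal, hypothetical sketch of what hash matching before an iCloud upload could look like. NeuralHash itself is proprietary and has no public API, so the SHA-256 stand-in, the `imageHash` and `shouldFlagForReview` names, and the empty hash database below are illustrative assumptions, not Apple's actual implementation; a real perceptual hash is designed to match even resized or recompressed copies of a photo.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in: NeuralHash is a proprietary perceptual-hash model, so SHA-256
// is used here purely to illustrate the matching flow, not to replicate Apple's system.

/// Hashes raw image bytes into a hex string (illustrative only).
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Simulated on-device check run before a photo is uploaded:
/// flag the photo only if its hash appears in a database of known CSAM hashes.
func shouldFlagForReview(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageHash(imageData))
}

// Example usage with placeholder data.
let knownHashes: Set<String> = []            // would ship on-device as a hash database
let photo = Data([0x00, 0x01, 0x02, 0x03])   // placeholder for real image bytes
print(shouldFlagForReview(photo, knownHashes: knownHashes))  // prints "false"
```

The key design point critics focus on is exactly what this sketch shows: the matching logic runs locally against whatever database of hashes is supplied to the device, and nothing in the mechanism itself limits what that database contains.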

Even though the photo-scanning feature is intended for a good purpose, it has raised eyebrows among security researchers around the world. Edward Snowden went so far as to say the feature will turn everybody's iPhone into "iNarcs." "If they can scan for kiddie porn today, they can scan for anything tomorrow," said Snowden in a tweet on Thursday.

Ross Anderson, professor of security engineering at the University of Cambridge, told the Financial Times, "It is an absolutely appalling idea because it is going to lead to distributed bulk surveillance of . . . our phones and laptops." He also called it a "huge and regressive step for individual privacy."

Matthew Green, a cryptography professor at Johns Hopkins University who was the first to report that Apple was designing such a system, asked, "why would Apple spend so much time and effort designing a system that is specifically designed to scan images that exist (in plaintext) only on your phone — if they didn’t eventually plan to use it for data that you don’t share in plaintext with Apple?"

Even the EFF called the photo-scanning feature a "shocking about-face" from Apple.

"That’s a fully built system just waiting for external pressure to make the slightest change," the foundation warned.

“We’ve said it before, and we’ll say it again now: It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses,” said the EFF in a statement.

For now, Apple has only previewed the controversial feature. The latest versions of iOS 15, iPadOS 15, and macOS Monterey do not scan your iCloud photos. Moreover, you can opt out by disabling iCloud Photos, so your photos are never uploaded and therefore never checked. However, following the backlash, the Cupertino-based giant may make changes before the feature is rolled out.

What are your thoughts on iCloud photo scanning? Do you think this feature should be optional, or do you believe Apple should be able to scan photos for CSAM? Let us know your opinion in the comments section down below!