German lawmakers to Tim Cook: Apple should reconsider its CSAM plans

Apple introduced new child-safety features earlier this month. The new CSAM detection system scans photos in the iCloud Photo Library, and a companion feature checks images in iMessage, for potential child sexual abuse material. But despite the good intentions, the new CSAM features have drawn considerable backlash.
Criticism has come not only from security experts and Apple device owners; even Apple’s own employees are unhappy with the plan. Since the announcement, Apple has published an FAQ in a bid to clear the air, and the company’s senior VP of software engineering, Craig Federighi, said in an interview that the CSAM features had been “misunderstood.”
According to a report published by Heise Online, the chairman of the German parliament’s Digital Agenda committee, Manuel Höferlin, has written a letter to Tim Cook asking Apple to reconsider its CSAM plans. He argues that Apple is treading a “dangerous path” and undermining “safe and confidential communication.”
Höferlin writes that Apple must refrain from implementing the function – both to avoid “foreseeable problems” for the company and to protect the modern information society.
Although CSAM detection will initially be rolled out only in the United States and, according to Apple, runs on-device, the German lawmaker still believes the system would become “the largest surveillance instrument of history” and could cost the Cupertino giant access to large markets if it sticks with this strategy.
For now, Apple has only previewed CSAM detection. It will arrive in future versions of iOS, iPadOS, and macOS, and according to Apple, the matching happens entirely on-device. The company also says it doesn’t inspect the pictures themselves but compares image hashes against a database of known CSAM hashes. Users can effectively opt out by not uploading their photos to iCloud or by disabling iCloud Photos altogether.
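To give a rough idea of what hash-based matching means, here is a minimal sketch in Swift. It assumes a hypothetical list of known hashes and uses an ordinary SHA-256 digest purely for illustration; Apple’s actual system relies on a perceptual “NeuralHash” and cryptographic threshold techniques rather than a plain lookup, so this should be read as a conceptual toy, not the real implementation.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known hashes (hex-encoded SHA-256 digests).
// Apple's real system uses perceptual "NeuralHash" values supplied by
// child-safety organizations and a private matching protocol, not a
// simple set lookup like this.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a SHA-256 digest of the image bytes and check it against the list.
func matchesKnownHash(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Usage: only photos queued for upload to iCloud Photos would ever be checked.
let sampleData = Data("example image bytes".utf8)
print(matchesKnownHash(imageData: sampleData))  // prints "false" for this sample
```

The point the sketch captures is that the device compares fingerprints of images against a fixed list rather than uploading or “looking at” the photo’s content directly.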
But given the mounting pressure, and now that a German lawmaker has written to Tim Cook directly, the Cupertino giant may make changes before the feature rolls out, or withdraw it altogether.
What are your thoughts on iCloud photo scanning? Do you think CSAM detection should be optional, or do you believe Apple should be able to scan photos for child safety? Let us know your opinion in the comments section down below!