Apple Postpones Controversial Scan Function for Child Porn

Apple is postponing an announced method to detect child pornography. The company wanted to scan the iPhones and iPads of American customers who store their photos in its iCloud cloud service, checking whether any child porn images were present.

Although Apple claimed that all kinds of security measures were in place to protect users’ privacy, the plans were met with much criticism.

Apple now says that after feedback from customers, rights organizations and researchers, the company will take more time to improve the software. It is not clear exactly how long that will take, but Apple says it will be working on the technology “in the coming months”. More than 90 organizations worldwide asked Apple to suspend the plans.

Critics argued that the system could also be abused: an oppressive government, for example, could demand that images be scanned in order to track down dissidents. Apple's defence went no further than saying the company would oppose such demands.

The criticism was also so fierce because Apple has been proud of its initiatives to protect user privacy for several years. For example, the company advertised with the text ‘What happens on your iPhone stays on your iPhone’.

As part of the same software update, Apple also wanted to scan children's text and chat messages for inappropriate material, after which a notification would be sent to the parents.