Apple Scans ALL Photos Uploaded by Users in iCloud

Apple has started to scan all pictures uploaded to iCloud by iPhone, iPad, iPod Touch, and Mac users. The announcement was made during CES 2020 by one of the American company's managers.

Apple's decision to analyze all pictures uploaded to iCloud is aimed at discovering which of them contain content suggesting that minors are being sexually exploited.

Apple says that it uses artificial intelligence to detect pictures of this kind uploaded to iCloud, and those flagged as containing such content are also reviewed by its employees.

"Apple is dedicated to protecting children across the entire ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed strong protections at all levels of our software platform and throughout the supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Like email spam filters, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitative content violate our terms and conditions of service and any accounts we find with this material will be disabled."
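The statement compares the system to email spam filters: uploads are matched against "electronic signatures" of known material rather than interpreted from scratch. Apple has not published implementation details, so as a loose illustration only, the sketch below shows the general idea of signature matching using an exact cryptographic hash against a hypothetical database of known signatures. (Real-world systems of this kind typically use perceptual hashes that survive resizing and re-encoding; the function names and the sample signature here are invented for the example.)

```python
import hashlib

# Hypothetical database of signatures (hex digests) of known flagged images.
# The entry below is simply the SHA-256 of the bytes b"foo", used as a stand-in.
KNOWN_SIGNATURES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_signature(data: bytes) -> str:
    """Compute an exact-match signature for raw image bytes.

    A cryptographic hash only matches byte-identical files; production
    systems use perceptual hashing instead. This is purely illustrative.
    """
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Return True if the image's signature appears in the known database."""
    return image_signature(data) in KNOWN_SIGNATURES

print(is_flagged(b"foo"))  # True: matches the sample signature above
print(is_flagged(b"bar"))  # False: no matching signature
```

As in the quoted statement, a match in a real deployment would not be acted on automatically but would be escalated for individual human review.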

Apple notifies the authorities when it discovers images suggesting that children have been sexually exploited, and all of this happens without the person who uploaded the pictures being notified.