Apple’s New Step to Prevent Child Abuse via an Encryption Feature
Apple has unveiled a new set of technical safeguards in its mobile software aimed at preventing child exploitation. A new opt-in feature for family iCloud accounts will use machine learning to detect sexually explicit images. The system can block such pictures from being sent or received, display warnings, and, in some circumstances, notify parents that their child viewed or sent them. The detection feature employs a cryptographic procedure that runs partly on the device and partly on Apple’s servers.
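To make the hybrid on-device/server design concrete, below is a minimal Python sketch of the threshold-matching flow as publicly described: the device fingerprints each photo before upload and attaches a “safety voucher”, and the server acts only once the number of matches crosses a threshold. This is an illustrative assumption-laden sketch, not Apple’s implementation: the real system uses NeuralHash (a perceptual hash robust to resizing and recompression) matched through a private-set-intersection protocol, the match result is cryptographically hidden from the device itself, and the actual threshold is undisclosed. SHA-256 and the threshold of 3 here are placeholders so the flow can run end to end.

```python
import hashlib
from dataclasses import dataclass

THRESHOLD = 3  # assumption: the real match threshold is undisclosed


def fingerprint(image_bytes: bytes) -> str:
    """On-device step: derive a fingerprint of the photo before upload.

    Placeholder only -- the real system uses NeuralHash, a perceptual
    hash, not a cryptographic hash like SHA-256.
    """
    return hashlib.sha256(image_bytes).hexdigest()


@dataclass
class SafetyVoucher:
    """Uploaded alongside each photo. In the real protocol the device
    cannot read the match bit; only the server can, and only after
    enough matches accumulate."""
    image_id: str
    matched: bool


def make_voucher(image_id: str, image_bytes: bytes,
                 known_hashes: set[str]) -> SafetyVoucher:
    """On-device step: compare the fingerprint against the database of
    known-CSAM hashes (blinded in the real protocol) and package the
    result into a voucher."""
    return SafetyVoucher(image_id, fingerprint(image_bytes) in known_hashes)


def server_flags_account(vouchers: list[SafetyVoucher]) -> bool:
    """Server-side step: flag the account for human review only once the
    match count crosses the threshold."""
    return sum(v.matched for v in vouchers) >= THRESHOLD


if __name__ == "__main__":
    # Hypothetical data: one banned image uploaded three times.
    banned = {fingerprint(b"known-bad-image")}
    uploads = [(f"img{i}", b"known-bad-image") for i in range(3)]
    vouchers = [make_voucher(iid, data, banned) for iid, data in uploads]
    print(server_flags_account(vouchers))  # True: threshold of 3 reached
```

The threshold design is the key privacy choice in this flow: a single match reveals nothing server-side, which is meant to limit false positives and keep individual photos unreadable until the threshold is crossed.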
Critics counter that, despite the privacy safeguards in place, scanning photos on a user’s device makes sense only if the images are to be encrypted before they leave the phone or computer, which would make server-side detection impossible. They also warn that Apple could extend the detection technology to photos on users’ devices that are never uploaded to iCloud, a form of on-device image scanning that would amount to a new kind of privacy intrusion into users’ offline storage.
The feature is currently available only in the United States. If it were extended to other countries, a major concern is whether those countries would be willing to accept it, given the data protection legislation they already have in place.
Source: https://www.wired.com/story/apple-csam-detection-icloud-photos-encryption-privacy/
For more information, please contact me at sashaatolia@privacad.com.
Sasha Atolia
Privacy & Data Protection Expert with Privacy Academy India (PAI)
Privacy & Data Protection Consultant with Privacy Consultancy Services India (PCS)