In news
Apple has announced that software updates later this year will bring new features that will “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)”.
Highlights
- The features, expected to go live initially in the United States, include new technology to limit the spread of CSAM online, especially via Apple platforms.
- There will also be on-device protection against children sending or receiving sensitive content, with mechanisms to alert parents if the user is below the age of 13.
- Apple explained that it will use cryptographic techniques in iOS and iPadOS to match images stored in iCloud Photos against known CSAM images.
Functioning
- The technology will match images in a user's iCloud Photos library against known CSAM images provided by child safety organisations.
- This is done without actually viewing the images: the system compares compact digital fingerprints of the photos, looking for what amounts to a fingerprint match (a toy version of this matching is sketched after this list).
- If the number of matches crosses a threshold, Apple will "report these instances to the National Center for Missing and Exploited Children (NCMEC)".
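To make the fingerprint idea concrete, here is a minimal Swift sketch of hash-based matching with a reporting threshold. Everything in it is an assumption for illustration: Apple's actual system uses a proprietary "NeuralHash" combined with cryptographic private set intersection, none of which is public, and the names `fingerprint(of:)`, `matches`, and `shouldReport`, as well as the distance and threshold values, are invented.

```swift
import Foundation

// Illustrative only. A "fingerprint" here is a toy 64-bit average hash
// of an 8x8 grayscale image; Apple's real fingerprinting is proprietary.
typealias Fingerprint = UInt64

/// Derive a toy fingerprint: each bit records whether a pixel is
/// brighter than the image's mean brightness (an "average hash").
func fingerprint(of pixels: [UInt8]) -> Fingerprint {
    precondition(pixels.count == 64, "expects an 8x8 grayscale image")
    let mean = pixels.reduce(0) { $0 + Int($1) } / pixels.count
    var bits: Fingerprint = 0
    for (i, p) in pixels.enumerated() where Int(p) > mean {
        bits |= 1 << Fingerprint(i)
    }
    return bits
}

/// Two fingerprints "match" when their Hamming distance is small,
/// so near-duplicates (resized, recompressed) still match.
func matches(_ a: Fingerprint, _ b: Fingerprint, maxDistance: Int = 4) -> Bool {
    (a ^ b).nonzeroBitCount <= maxDistance
}

/// Count library images whose fingerprints match the known database;
/// only a count at or above the threshold triggers any report.
func shouldReport(library: [Fingerprint],
                  knownDatabase: [Fingerprint],
                  threshold: Int = 30) -> Bool {
    let matchCount = library.filter { image in
        knownDatabase.contains { matches(image, $0) }
    }.count
    return matchCount >= threshold
}
```

The design point the sketch mirrors is that matching operates on compact fingerprints rather than on the images themselves, and that no single match, only a count crossing a threshold, would trigger a report.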
Other features
- Apple’s new communication safety for Messages will blur a sensitive image and warn a child about the nature of the content.
- If the setting is enabled, the child can also be told that their parents have been alerted about the message they have viewed.
- The same will apply if the child decides to send a sensitive message.
- Apple said Messages will use "on-device machine learning to analyse image attachments and determine if a photo is sexually explicit", and that Apple will not get access to the messages (a sketch of this flow follows).
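Below is a hedged Swift sketch of the Messages flow as described. `SensitivityClassifier`, `Attachment`, `ChildAccount`, and `handleIncoming` are all hypothetical names; Apple's on-device model and the real Messages internals are not public. The sketch only mirrors the behaviour the report describes: classify on device, blur and warn, and alert parents only for children under 13 when the setting is on.

```swift
import Foundation

/// Hypothetical stand-in for Apple's on-device model: it runs locally
/// and returns true if the image is judged sexually explicit.
protocol SensitivityClassifier {
    func isSexuallyExplicit(_ imageData: Data) -> Bool
}

struct Attachment {
    let imageData: Data
}

struct ChildAccount {
    let age: Int
    let parentalAlertsEnabled: Bool   // opt-in setting, per the report
}

enum SafetyAction {
    case showNormally
    case blurAndWarn(notifyParents: Bool)
}

/// Decide what Messages should do with an incoming image. Nothing
/// leaves the device: only the local classifier sees the pixels.
func handleIncoming(_ attachment: Attachment,
                    for child: ChildAccount,
                    using classifier: SensitivityClassifier) -> SafetyAction {
    guard classifier.isSexuallyExplicit(attachment.imageData) else {
        return .showNormally
    }
    // Parents are alerted only for younger children with the setting on.
    let notify = child.parentalAlertsEnabled && child.age < 13
    return .blurAndWarn(notifyParents: notify)
}
```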
Source: Indian Express