Later this year, Apple plans to release new software updates, and the company has just announced a preview of the child safety features that will be part of them.
The preview includes three features: Communication Safety in Messages, scanning of iCloud photos for Child Sexual Abuse Material (CSAM), and expanded CSAM guidance in Siri and Search.
With the new Communication Safety feature, children and their parents will be warned when sexually explicit photos are received or sent. Such photos will be identified through on-device machine learning, which will also blur the images. For instance, if a child receives a sexually explicit photo in the Messages app, an alert will warn them that the photo may contain private body parts and could be hurtful. According to Apple, this feature will be part of iOS 15, iPadOS 15, and macOS Monterey for accounts set up as families in iCloud.
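Apple has not detailed how the on-device check works, but the behavior it describes, classifying a photo locally and blurring it before display, can be sketched with standard Apple frameworks. In the hypothetical Swift example below, the Core ML model behind `classifier` and its "sensitive" label are assumptions; only the Vision and Core Image calls are real API.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import Vision

// Hypothetical sketch: run an on-device classifier over an incoming photo
// and blur it before display if it is flagged. `model` would wrap a Core ML
// classifier; Apple's actual model and labels are not public.
func screenIncomingPhoto(_ photo: CIImage,
                         classifier model: VNCoreMLModel,
                         completion: @escaping (CIImage) -> Void) {
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Assumption: the classifier emits a "sensitive" label with a confidence.
        let flagged = (request.results as? [VNClassificationObservation])?
            .contains { $0.identifier == "sensitive" && $0.confidence > 0.9 } ?? false

        guard flagged else {
            completion(photo)                 // show the photo unchanged
            return
        }

        // Blur the flagged photo; Messages would also show a warning alert.
        let blur = CIFilter.gaussianBlur()
        blur.inputImage = photo
        blur.radius = 40
        completion(blur.outputImage ?? photo)
    }
    try? VNImageRequestHandler(ciImage: photo).perform([request])
}
```

A real implementation would also present the warning alert described above, which the sketch omits.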
The photo-scanning technology is designed to detect known CSAM images stored in iCloud Photos and to help report them to the National Center for Missing and Exploited Children (NCMEC), which works in partnership with US law enforcement agencies. To protect user privacy, Apple said the system relies on a hashing technology called NeuralHash, which converts each image into a unique number specific to that image, so matching against known material can happen without exposing the photos themselves.
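NeuralHash itself is not public, but the general idea of a perceptual hash, mapping visually similar images to the same fixed-width number so that only numbers need to be compared, can be illustrated with a classic "average hash". The Swift sketch below is a generic stand-in for illustration, not Apple's algorithm; the 8x8 grayscale reduction and the `matchesKnownImage` helper are assumptions.

```swift
import CoreGraphics
import Foundation
import ImageIO

// A minimal perceptual "average hash", a generic stand-in for NeuralHash:
// shrink the image to 8x8 grayscale, then set one bit per pixel depending on
// whether it is brighter than the mean. Visually similar images yield similar
// 64-bit hashes, so matching can compare numbers instead of photo contents.
func averageHash(of url: URL) -> UInt64? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let image = CGImageSourceCreateImageAtIndex(source, 0, nil)
    else { return nil }

    var pixels = [UInt8](repeating: 0, count: 64)
    let drawn = pixels.withUnsafeMutableBytes { buffer -> Bool in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: 8, height: 8,
                                      bitsPerComponent: 8, bytesPerRow: 8,
                                      space: CGColorSpaceCreateDeviceGray(),
                                      bitmapInfo: CGImageAlphaInfo.none.rawValue)
        else { return false }
        context.draw(image, in: CGRect(x: 0, y: 0, width: 8, height: 8))
        return true
    }
    guard drawn else { return nil }

    // One bit per pixel: 1 if brighter than the average, 0 otherwise.
    let mean = pixels.reduce(0) { $0 + Int($1) } / 64
    var hash: UInt64 = 0
    for (index, pixel) in pixels.enumerated() where Int(pixel) > mean {
        hash |= 1 << UInt64(index)
    }
    return hash
}

// Matching step: only hashes are compared, never the photos themselves.
func matchesKnownImage(at url: URL, knownHashes: Set<UInt64>) -> Bool {
    guard let hash = averageHash(of: url) else { return false }
    return knownHashes.contains(hash)
}
```

In the system Apple describes, the set of known hashes would be derived from CSAM images already catalogued by NCMEC, so users' own photos are never inspected during matching, only their hashes.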
Finally, Apple will expand CSAM guidance in Siri and Search, offering additional resources to help keep children safe. For instance, if a user asks Siri how to report child exploitation, Siri will point them to resources for where and how to file a report. Apple has confirmed that this feature will be part of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
Apple’s new child safety features will roll out first to users in the US and will eventually become available in other countries.