Apple is releasing the second developer beta of iOS 15.2, and it includes a notable new feature. Apple says that iOS 15.2 beta 2 adds support for the new communication safety feature in the Messages app. This is one of the features Apple first announced in August but ultimately delayed after pushback.
First and foremost, it’s important to remember that Apple’s original August announcement included three different features: CSAM detection for iCloud Photos, updates to Siri and Search to better handle unsafe situations, and communication safety in Messages.
Today’s release of iOS 15.2 beta 2 includes only the new communication safety feature in Messages. It does not include CSAM detection, and it does not include the Siri and Search guidance. There’s no timetable for when the controversial CSAM detection feature will launch, but Apple says the Siri and Search updates are coming later this year.
The fundamentals of the communication safety feature are that the Messages app will warn children when they receive or send photos that contain nudity, though the feature is not enabled by default. Instead, a parent or guardian can opt their child in through Family Sharing.
When a child receives an image in the Messages app that contains nudity, the photo will be blurred. The child will then be warned and presented with resources that can help them, but they have the option to go ahead and view the image. Similar protections are also in place for when a child tries to send an image containing nudity.
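The flow described above can be summarized as a simple decision sketch. This is purely illustrative, written in Python with hypothetical names; Apple’s actual implementation runs on-device in Messages and its internals are not public.

```python
# Hypothetical sketch of the communication-safety flow described above.
# All names (ChildAccount, handle_incoming_image) are illustrative, not Apple APIs.

from dataclasses import dataclass


@dataclass
class ChildAccount:
    # Opted in by a parent or guardian via Family Sharing; off by default.
    safety_enabled: bool


def handle_incoming_image(account: ChildAccount, contains_nudity: bool) -> str:
    """Return how an incoming image should be presented to the child."""
    if not account.safety_enabled:
        return "show"  # feature is not enabled by default
    if not contains_nudity:
        return "show"  # on-device check found nothing sensitive
    # The image is blurred; the child is warned, offered resources, and may
    # still choose to view it or message someone they trust for help.
    return "blur_and_warn"
```

For example, a child opted in by a parent who receives a flagged image would hit the `blur_and_warn` branch, while the same image sent to an account that was never opted in is shown normally.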
Compared to the version of this feature that was announced in August, Apple has made some changes in response to concerns and criticism. Initially, Apple had implemented the feature such that parents of children under the age of 13 could be notified if the child viewed a nude image in Messages.
Apple says that after announcing the feature, it heard feedback that this notification system could put children at risk in certain situations. As such, the implementation in iOS 15.2 beta 2 focuses on giving the child more control: children of any age are given the option to message someone they trust for help if they’d like, and that decision is completely separate from their decision to view the image.
From a privacy and security standpoint, it’s important to note that Messages analyzes image attachments on-device to check whether a photo contains nudity. The end-to-end encryption of messages is maintained, and no indication of the nudity detection leaves the user’s device.
We’re digging into today’s release of iOS 15.2 and will update this story as we learn more.