Apple is set to roll out to UK iPhones a snooping feature that scans messages for nudity. The feature uses AI technology to scan incoming and outgoing messages. For the moment it is optional, allowing parents to turn on warnings for their children's iPhones. When enabled, all photos sent or received by the child using the Messages app will be scanned for nudity.

If nudity is found in a photo received by a child with the setting turned on, the photo will be blurred, and the child will be warned that it may contain sensitive content and nudged towards resources from child safety groups. If nudity is found in photos sent by a child, similar protections kick in: the child is encouraged not to send the images and given an option to Message a Grown-Up.

All the scanning is carried out on-device, meaning the images are analysed by the iPhone itself; Apple says it never sees either the photos being analysed or the results of the analysis.
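
Apple has not published the implementation, but on-device image classification of this kind typically pairs a bundled Core ML model with the Vision framework. The sketch below is illustrative only: the NudityClassifier model, the "sensitive" label, the 0.8 confidence threshold and the screenPhoto function are all assumptions for the sake of the example, not Apple's actual code.

```swift
import UIKit
import Vision
import CoreML

// Illustrative sketch of on-device sensitive-content screening.
// "NudityClassifier" is a hypothetical bundled Core ML model; the
// label name and confidence threshold are assumptions.
func screenPhoto(_ image: UIImage, onFlagged: @escaping () -> Void) {
    guard let cgImage = image.cgImage,
          let classifier = try? NudityClassifier(configuration: MLModelConfiguration()),
          let visionModel = try? VNCoreMLModel(for: classifier.model) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        // The verdict never leaves the phone: no photo or result is uploaded.
        if top.identifier == "sensitive" && top.confidence > 0.8 {
            onFlagged() // e.g. blur the image and show the warning sheet
        }
    }

    // Run the classifier locally against the image pixels.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

In a design like this the classifier's output only drives local UI, such as blurring and warning sheets, which is consistent with Apple's claim that neither the image nor the analysis result leaves the device.
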
As originally announced in summer 2021, the Communication Safety feature in Messages and the search warnings were part of a trio of features, the third being the detection of child sexual abuse material (CSAM) in iCloud Photos. The trio proved extremely contentious, and Apple delayed the launch of all three while it negotiated with privacy and child safety groups.

Of course, having implemented the feature as an option, it won't be long before it becomes one that can be turned on by law enforcement in the name of seeking out terrorists, racists, anti-vaxxers and the like.