Apple plans to introduce several image analysis functions on the iPhone in the near future. They will help in the fight against child pornography.

© Les Numeriques

Do the ends justify the means? It is a thorny question in technology and cybersecurity, and Apple's latest announcement in this area has not gone unnoticed. The Cupertino-based firm is taking steps to protect children, and that includes tools deployed on our smartphones. "We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the distribution of child pornography," the company writes on its website.

Apple will offer new communication tools that will allow parents to "play a more informed role in helping their children navigate" online communication. For example, the Messages app will use on-device machine learning to warn of sensitive content while keeping private communications unreadable by Apple. Flagged images will be blurred, and minor users will be redirected to prevention resources. Siri and Search will also provide parents and children with more information and help if they encounter dangerous situations. The third proposal is the one being debated in the cybersecurity community.

An analysis performed directly on the smartphone

Indeed, Apple plans to install software on iPhones that reviews content uploaded to iCloud in order to search for child pornography images. "Apple's method of detecting these images is designed with users' privacy in mind," the company says. "Instead of scanning images in the cloud, the system performs on-device matching using a database of image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations."

Simply put, there is no question of alerting the police because a user keeps pictures of their kids at the pool on their phone. The technology computes a "fingerprint" of each image, which is then compared against a database of fingerprints of known illicit images. In case of a match, the system notifies a team of human reviewers, who can in turn file a report with NCMEC if needed. Apple assures that these processes rely on robust cryptographic techniques that protect users' safety and privacy. The database containing the encrypted fingerprints of child pornography images provided by NCMEC will be integrated directly into iOS 15.
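To make the fingerprinting idea concrete, here is a minimal sketch in Python. Apple's real system uses a neural perceptual hash ("NeuralHash") combined with cryptographic private set intersection, none of which is shown here; this toy uses a simple average-hash over raw grayscale pixels purely to illustrate why a near-duplicate image still matches while an unrelated image does not. All function names and the 4×4 sample images are invented for the example.

```python
def average_hash(pixels):
    """Fingerprint an image: one bit per pixel, 1 if above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_database(image, database, threshold=2):
    """True if the image's fingerprint is close to any known fingerprint."""
    h = average_hash(image)
    return any(hamming_distance(h, known) <= threshold for known in database)

# Toy 4x4 grayscale image standing in for a known illicit picture.
known_image = [[10, 200, 10, 200],
               [200, 10, 200, 10],
               [10, 200, 10, 200],
               [200, 10, 200, 10]]
database = {average_hash(known_image)}

# A slightly altered copy keeps the same fingerprint and still matches.
altered = [row[:] for row in known_image]
altered[0][0] = 30
print(matches_database(altered, database))            # True
print(matches_database([[0] * 4] * 4, database))      # False
```

The key design point is that only fingerprints are compared, never pixel data, which is why an innocent family photo (a completely different fingerprint) cannot match a database entry.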


For the time being, this update to iOS and iPadOS will roll out only in the United States, where legislation on the handling of user data is more permissive than in Europe. These features will arrive later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.