Apple has confirmed that it will launch image-scanning software that will analyze photos stored in users' iCloud Photos accounts to detect images of child sexual abuse and report relevant findings to authorities.

The firm also announced a feature that will analyze photos sent and received in the Messages app to or from children to check whether they are sexually explicit, Al Jazeera reported.

Once Apple detects sexually explicit photos of children in a user's account, the company will manually review them and then report the incident to the National Center for Missing and Exploited Children, which works with law enforcement agencies.

The tech giant noted that users' images would be analyzed on their iPhones and iPads before being uploaded to the cloud backup service.
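To make the on-device step concrete, here is a minimal Python sketch of perceptual hash matching, the general technique that such scanning systems build on. It uses a simple average hash rather than Apple's proprietary NeuralHash, and the known-image database and match threshold below are hypothetical stand-ins for illustration only.

```python
# Minimal sketch of on-device fingerprint matching, assuming a generic
# average-hash ("aHash") algorithm. Apple's real system uses a proprietary
# neural hash plus cryptographic matching; this only shows the basic idea
# of comparing image fingerprints before an upload happens.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid and build a 64-bit
    fingerprint: each bit records whether a pixel is above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical stand-ins for the database of known-abuse-image
# fingerprints described in the article, plus a similarity cutoff.
KNOWN_HASHES = {0x8F3C1A2B4D5E6F70}
MATCH_THRESHOLD = 5  # assumed: max differing bits that still count as a match

def should_flag_before_upload(path: str) -> bool:
    """Runs locally on the device before the photo is sent to the cloud."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)
```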


Apple to Scan Devices For Images of Child Sex Abuse

Cloud providers such as Google, Dropbox, and Microsoft already scan users' files for child abuse imagery, but Apple has until now resisted scanning files stored in its cloud, TechCrunch reported.

Apple said it gives users the option to encrypt their data before it ever reaches its iCloud servers.

The company also said the new detection technology is extremely accurate, claiming a one-in-a-trillion probability of falsely flagging a scanned account, according to an Ars Technica report.
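The one-in-a-trillion figure applies to wrongly flagging a whole account, not a single photo. One plausible way such a number arises is by requiring multiple hash matches before an account is flagged: if each photo has a tiny independent chance of a false match, the chance of crossing a multi-match threshold is a binomial tail. The sketch below illustrates that arithmetic; every parameter is an assumption, not Apple's published figure.

```python
# Back-of-the-envelope binomial-tail calculation showing how a match
# threshold can push the account-level false-flag rate to near zero.
# Every number here is an assumption for illustration only.
from math import comb

def prob_false_flag(n_images: int, p_match: float, threshold: int,
                    max_terms: int = 50) -> float:
    """P(at least `threshold` false matches among `n_images` photos),
    with each photo falsely matching independently with prob `p_match`.
    Only the first `max_terms` tail terms are summed; they dominate."""
    upper = min(threshold + max_terms, n_images)
    return sum(comb(n_images, k)
               * p_match ** k
               * (1 - p_match) ** (n_images - k)
               for k in range(threshold, upper + 1))

# Assumed: 10,000 photos, a one-in-a-million per-image false match rate,
# and 30 matches required before an account is flagged.
print(prob_false_flag(10_000, 1e-6, 30))  # vanishingly small, about 4e-93
```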

The Cupertino, California-based technology giant previewed the three new features on Thursday, August 5, and said the changes would roll out in iOS 15, watchOS 8, and macOS Monterey updates later this year.

The Messages app on Apple devices will also be updated with new tools to warn children and their parents when sexually explicit content is received or sent.

The company noted that the feature would not give Apple access to the messages. Instead, the Messages app will use on-device machine learning to analyze image attachments.
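Apple has not published its Messages classifier, but the flow the company described, scoring an attachment locally and blurring it with a warning if it looks explicit, can be sketched roughly as follows. The scoring function is a deliberate stub standing in for a trained on-device model, and the threshold is an assumption.

```python
# Hedged sketch of the on-device Messages flow described above. Nothing
# here leaves the device; `explicit_score` is a placeholder stub for a
# trained on-device model, not Apple's actual classifier.
from dataclasses import dataclass

WARN_THRESHOLD = 0.9  # assumed confidence cutoff for showing a warning

def explicit_score(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML model that returns the probability
    that an image is sexually explicit. Stubbed out for illustration."""
    return 0.0  # a real model would infer this from the pixel data

@dataclass
class AttachmentDecision:
    blur: bool
    warn_child: bool

def screen_attachment(image_bytes: bytes) -> AttachmentDecision:
    """Runs locally when a message arrives; Apple never sees the content."""
    flagged = explicit_score(image_bytes) >= WARN_THRESHOLD
    return AttachmentDecision(blur=flagged, warn_child=flagged)
```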

Siri and Spotlight Search

Apple is also adding features to its Siri digital voice assistant to intervene when users search for child abuse-related material.

Users can also ask Siri how to report child sexual abuse material or child abuse, and they will be pointed to resources for where and how to file a report, MacRumors reported.

John Clark, president and CEO of the National Center for Missing & Exploited Children, said Apple's expanded safeguards for children were a game changer.

Clark noted that privacy and child protection could co-exist, lauding the tech giant's efforts to fight child sexual abuse.

However, the new changes also drew scrutiny from critics. Greg Nojeim, co-director of the Center for Democracy & Technology's Security & Surveillance Project, said the company is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship.

Nojeim added that the firm should abandon the changes and restore its users' faith in the security and integrity of their data on Apple devices and services.

