Apple to Scan Photos on iPhones and iCloud for Child Abuse Imagery

Apple will scan photos stored on iPhones and in iCloud for child abuse imagery, according to the Financial Times.

The new system could aid law enforcement in criminal investigations, but it may also open the door to broader legal and government demands for user data.

The system is called neuralMatch. It will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said.

neuralMatch has been tested on more than 200,000 images from the National Center for Missing & Exploited Children. It will roll out in the US first; photos uploaded to iCloud will be compared against the images in that database.

The New System to be Used in the US First


“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said.

“Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
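The reported flow — hash each uploaded photo, compare it against a database of known imagery, and escalate only once a threshold of matches is reached — can be illustrated with a minimal sketch. This is not Apple's implementation: the function names, the use of a cryptographic hash in place of a perceptual one, and the threshold value are all illustrative assumptions.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Illustrative stand-in: real matching systems use perceptual hashes
    # that survive resizing and re-encoding, not a cryptographic hash.
    return hashlib.sha256(data).hexdigest()

def scan_library(images, known_hashes, threshold):
    """Return (flagged, matches): flagged is True only once the number
    of photos matching the known-hash database reaches the threshold."""
    matches = [img for img in images if image_hash(img) in known_hashes]
    return len(matches) >= threshold, matches
```

A library with two matching photos against a threshold of three would not be flagged; a third match would tip it over, which is the "certain number of photos" behavior the report describes.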

Matthew Green, a cryptographer and professor at Johns Hopkins University, posted on Twitter: “This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?”


“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”


Apple already scans its cloud storage for child abuse imagery, but the new system goes further: it gives Apple centralized access to photos on local storage, and it could later be extended to crimes other than child abuse imagery.

According to the Financial Times, the company has briefed some academics on the new system.

Apple may release the system as soon as this week, according to two security researchers who were briefed on the matter at an earlier Apple meeting.

Apple has long marketed the privacy protections built into its devices, and famously refused the FBI’s request to install a backdoor into iOS to unlock an iPhone used by one of the gunmen in the 2015 San Bernardino massacre.
