Apple plans to scan photos on users' iPhones to detect images of child abuse, the Financial Times reports, citing a source.
According to the newspaper, the system will continuously compare photos stored on the iPhone and in iCloud against a database of 200,000 child abuse images compiled by human rights organizations. Each photo will be encrypted and flagged if it looks suspicious. Once a certain number of suspicious photos accumulates, Apple will decrypt those files for review by specialists. If the company's specialists confirm that the images are illegal, they may pass them on to law enforcement agencies.
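The matching scheme described above can be illustrated with a minimal sketch. This is a conceptual illustration only, not Apple's actual implementation: real systems of this kind use perceptual hashes that tolerate resizing and re-encoding, while this toy version uses an exact cryptographic hash, and the database contents and threshold value here are invented for the example.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images
# (stand-in byte strings; a real system would hold perceptual hashes).
KNOWN_HASHES = {
    hashlib.sha256(img).hexdigest()
    for img in [b"known-image-1", b"known-image-2"]
}

# Hypothetical threshold: matches are only surfaced for human review
# once this many suspicious photos have accumulated.
THRESHOLD = 2

def scan(photos):
    """Flag photos whose hash is in the database; report them only
    once the number of matches reaches the threshold."""
    matches = [p for p in photos
               if hashlib.sha256(p).hexdigest() in KNOWN_HASHES]
    return matches if len(matches) >= THRESHOLD else []
```

The threshold step mirrors the article's claim that individual flagged photos stay encrypted, and only an accumulation of matches triggers decryption and human review.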
According to the publication, the system proposed by Apple is called neuralMatch. For now, its rollout is planned only in the United States, where the company will install the software on smartphones.
Apple declined to comment.
Data security experts surveyed by the publication fear that scanning iPhones could be the first step toward surveillance of millions of personal devices. In their view, governments in various countries may try to use Apple to gain access to citizens' personal data.