Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.
Apple detailed its proposed system—known as “neuralMatch”—to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicized more widely as soon as this week, they said.
If the automated system believes illegal imagery has been detected, it would proactively alert a team of human reviewers, who would then contact law enforcement if the material could be verified. The scheme will initially roll out only in the US.
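The report does not describe how the matching itself works. Based on how systems of this kind are typically built, a plausible reading is that the software computes a perceptual hash of each image and compares it against a database of hashes of known illegal imagery, flagging near-matches for human review. The sketch below illustrates that general pattern only; the hash format, database, threshold, and all names here are assumptions, not details of Apple's system.

```swift
import Foundation

// Hypothetical 64-bit perceptual hash of an image; the real system's
// hash format and matching rules have not been made public.
typealias PerceptualHash = UInt64

// Hamming distance: the number of bits that differ between two hashes.
// Perceptually similar images should produce hashes with a small distance.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

// Illustrative database of hashes of known illegal imagery, as might be
// supplied by a clearinghouse; the values here are made up.
let knownHashes: [PerceptualHash] = [0xDEAD_BEEF_CAFE_F00D, 0x0123_4567_89AB_CDEF]

// Flag an image for human review if its hash falls within a small
// Hamming distance of any known hash. The threshold is an assumption.
func shouldFlagForReview(_ imageHash: PerceptualHash, threshold: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance($0, imageHash) <= threshold }
}

// Example: a hash one bit away from a known entry is flagged.
let candidate: PerceptualHash = 0xDEAD_BEEF_CAFE_F00C
print(shouldFlagForReview(candidate))  // true
```

The design choice such systems face is visible even in this toy version: a looser threshold catches more altered copies of known images but raises the false-positive rate, which is why the reported scheme routes flags to human reviewers rather than directly to law enforcement.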
source https://arstechnica.com/?p=1785312