Apple Inc. is ruffling feathers with a controversial plan to install software on all iPhones that detects child abuse imagery, the Financial Times reports. The company plans to roll out software called “neuralMatch” that scans photos stored on a user’s device and iCloud account, then flags concerning images to a team of reviewers. Privacy experts compared the move to George Orwell’s 1984. “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of… our phones and laptops,” said Ross Anderson, a professor at the University of Cambridge. Apple recently said it would overhaul Siri after concerns about data being sent to Apple servers, The Guardian reports. Testing of “neuralMatch” has already begun with the help of the National Center for Missing and Exploited Children.
Read it at Financial Times