A victim of childhood sexual abuse is suing Apple over its 2022 decision to abandon a previously announced plan to scan images stored in iCloud for child sexual abuse material.
Apple originally introduced the plan in late 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos with algorithmically detected nudity.
The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.