A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.
Apple originally introduced a plan in late 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos containing algorithmically detected nudity.
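Apple never shipped the system, and its proposed design relied on a proprietary perceptual hash ("NeuralHash") plus cryptographic private set intersection, none of which is public. Purely to illustrate the general idea of on-device hash matching before upload, here is a loose, hypothetical sketch; the SHA-256 digest and the `knownHashes` database are stand-in assumptions, not Apple's actual pipeline:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only. Apple's proposed system used a proprietary
// perceptual hash, not a cryptographic one; SHA-256 stands in here
// purely to illustrate on-device matching against a known-hash set.
func hexDigest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// knownHashes is an assumed placeholder for a database of known-CSAM
// hashes that, under the proposed design, would have shipped with the OS.
func shouldFlagBeforeUpload(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(hexDigest(of: imageData))
}
```

A cryptographic hash like SHA-256 only matches byte-identical files, which is why the real proposal used perceptual hashing (designed to survive resizing and recompression); that distinction was central to the privacy debate over false positives.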
The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.
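For a sense of what on-device nudity detection looks like in practice, Apple later exposed similar functionality to third-party developers through the SensitiveContentAnalysis framework (iOS 17 and macOS 14 or later). A minimal sketch, assuming the app holds the com.apple.developer.sensitivecontentanalysis.client entitlement and the user has enabled Sensitive Content Warnings:

```swift
import Foundation
import SensitiveContentAnalysis

// Checks an image file for nudity entirely on-device using Apple's
// public SensitiveContentAnalysis framework. Requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
func imageIsSensitive(at url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```

This developer framework is separate from the Communication Safety feature in Messages, but both rely on the same on-device analysis approach, with no images leaving the device.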