A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.
Apple originally introduced a plan in 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos containing algorithmically detected nudity.
The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.
Source: AppleInsider News