A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.
Apple originally introduced a plan in late 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos containing algorithmically detected nudity.
The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.