
Apple sued over 2022 dropping of CSAM detection features

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.

Apple has retained nudity detection in images, but dropped some CSAM protection features in 2022.

Apple originally introduced a plan in late 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos containing algorithmically detected nudity.

The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.


Source: AppleInsider News
