Apple sued over 2022 dropping of CSAM detection features

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.

Apple has retained nudity detection in images, but dropped some CSAM protection features in 2022.

Apple originally introduced a plan in 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before sending or receiving photos with algorithmically detected nudity.
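For readers unfamiliar with the approach, the sketch below illustrates the general idea of hash-based matching only. It uses a plain SHA-256 fingerprint and a hypothetical knownHashes set for illustration; Apple's announced system relied on a perceptual hash (NeuralHash) designed to tolerate small image edits, plus additional privacy machinery, none of which this reproduces.

    import Foundation
    import CryptoKit

    // Hypothetical database of fingerprints for known flagged images.
    // In a real system these would come from a vetted external source.
    let knownHashes: Set<String> = [
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
    ]

    // Compute a fingerprint for an image before it is uploaded.
    // Note: a cryptographic hash, unlike a perceptual hash, changes
    // completely if even one byte of the image changes.
    func fingerprint(of imageData: Data) -> String {
        let digest = SHA256.hash(data: imageData)
        return digest.map { String(format: "%02x", $0) }.joined()
    }

    // Flag the upload only if its fingerprint matches the database.
    func shouldFlag(_ imageData: Data) -> Bool {
        knownHashes.contains(fingerprint(of: imageData))
    }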

The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.

Source: AppleInsider News
