A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.
Apple originally introduced a plan in late 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos with algorithmically detected nudity.
The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.