A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.
Apple originally introduced the plan in 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos containing algorithmically detected nudity.
The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.