Apple Sued for Not Implementing CSAM Detection Feature on iCloud

Apple is being sued over its decision to abandon its planned system for scanning iCloud Photos for child sexual abuse material (CSAM), a feature the company dropped after citing security and privacy concerns.


Source: PetaPixel