
Apple sued for dropping CSAM detection for iCloud


Apple is being sued over its decision not to implement a system that would have scanned iCloud Photos for child sexual abuse material (CSAM).

The lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma, according to The New York Times. The suit says Apple announced “widely advertised enhanced designs intended to protect children” and then “failed to implement those designs or take any action to detect and limit” the material.

Apple first announced the system in 2021, explaining that it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. But those plans appeared to have been abandoned after security and privacy advocates warned they could create a backdoor for government surveillance.

The lawsuit reportedly comes from a 27-year-old woman who is suing Apple under a pseudonym. She said a relative molested her when she was an infant and shared photos of her on the internet, and that she still receives near-daily notices from law enforcement about someone being charged with possessing those images.

Attorney James Marsh, who is involved in the lawsuit, said there is a potential group of 2,680 victims in the case who could be entitled to compensation.

TechCrunch has reached out to Apple for comment. A company spokesperson told The Times that the company is “urgently and proactively innovating to combat these crimes without compromising the security and privacy of all our users.”

In August, a 9-year-old girl and her guardian sued Apple, accusing the company of failing to address CSAM in iCloud.

This article was originally published on techcrunch.com
