Apple is facing a lawsuit over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The suit argues that by failing to take stronger action against the spread of this material, Apple is forcing victims to relive their traumatic experiences.
The lawsuit accuses Apple of publicly promising “a widely touted improved design aimed at protecting children” but then failing to implement “those designs or take any measures to detect and limit” this harmful content.
Apple first announced in 2021 that it would build a system to scan iCloud photos against digital signatures of known CSAM supplied by the National Center for Missing and Exploited Children and other organisations, allowing known abusive images in users’ iCloud accounts to be detected. However, the company reportedly dropped the plan after security and privacy advocates warned that such a system could open the door to government surveillance.
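At its core, detection of this kind means comparing a fingerprint (hash) of each photo against a database of fingerprints of already-known abusive images. The sketch below is only a minimal illustration of that general idea, with a hypothetical hash database and an ordinary cryptographic hash; Apple’s actual 2021 proposal used a perceptual hash (NeuralHash) and privacy-preserving matching so that near-duplicate images could be flagged without exposing users’ photos.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digital signatures (hashes) of known images,
# as would be supplied by a clearinghouse such as NCMEC.
# Real systems use perceptual hashes so resized or re-encoded copies
# still match; SHA-256 is used here purely for illustration.
KNOWN_HASHES: set[str] = {
    # placeholder entries would go here
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose hash appears in the known-hash database."""
    return [p for p in photo_dir.iterdir()
            if p.is_file() and file_hash(p) in KNOWN_HASHES]
```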
The current lawsuit comes from a 27-year-old woman who is suing Apple under a pseudonym, reports The New York Times (via TechCrunch). She says a relative molested her as an infant and shared images of the abuse online, and that despite efforts to stop their spread, she continues to receive notices from law enforcement almost daily about individuals arrested for possessing those same images.
James Marsh, an attorney involved in the case, has said that as many as 2,680 victims could qualify for compensation if the lawsuit succeeds.
In a statement to The Times, a company spokesperson said Apple is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
This isn’t the first legal action against Apple related to CSAM detection. In August, a 9-year-old girl and her guardian also sued the company, accusing it of failing to tackle the spread of CSAM on iCloud.
The outcome of this lawsuit could have major implications for Apple and how tech companies balance privacy and child protection efforts.