A child sexual abuse survivor has sued Apple, accusing the company of abandoning its proposed system for detecting child sexual abuse material (CSAM) on iCloud. The plaintiff alleges that Apple’s inaction enables the continued spread of abusive images, compounding victims’ trauma. Apple, which shelved the system over privacy concerns, faces demands for accountability in a lawsuit representing a potential group of more than 2,600 victims. Apple says it continues to develop protections against CSAM without compromising user privacy.