Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM).
Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material.
A second suit says Apple isn't doing enough to stop the spread of harmful images and videos and that it's revictimizing the people depicted in them.
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material.
It claims that, after Apple showed off its planned child safety tools, the company “failed to implement those designs or take ...
State police charged a Cumbola man with possessing child sexual abuse materials (CSAM). Gary Jon Hysock Jr., 36, initially denied seeing or possessing CSAM when asked by police Tuesday. Law enforcement received ...
Thousands of CSAM victims are suing Apple for dropping plans to scan devices for the presence of child sexual abuse materials. In addition to facing more than $1.2 billion in penalties, the company could be ...
Thousands of victims banded together for a proposed class action lawsuit against Apple, with the company now facing more than $1.2 billion in potential damages.
Announced in 2021, the plan was for Apple to scan images on iCloud for child abuse material using on-device technology. While ...
Apple faces a $1.2 billion lawsuit for failing to address child sexual abuse material (CSAM) after cancelling a detection tool.
This year, 18 states passed laws that make clear that sexual deepfakes depicting minors are a crime. Experts say schools should update their policies to account for these AI-generated images as well.
New class action suit filed by thousands of victims accuses the tech giant of hiding behind cybersecurity concerns instead of ...