From X's new AI image generation tool to ChatGPT's $200 Pro plan – here's everything that happened in the tech world this ...
State police charged a Cumbola man with possessing child sexual abuse materials (CSAM). Gary Jon Hysock Jr., 36, initially denied seeing or possessing CSAM when questioned by police Tuesday. Law enforcement received ...
This year, 18 states passed laws that make clear that sexual deepfakes depicting minors are a crime. Experts say schools should update their policies to account for these AI-generated images as well.
New class action suit filed by thousands of victims accuses the tech giant of hiding behind cybersecurity concerns instead of ...
Research led by the University of Southampton suggests that police investigating Child Sexual Abuse Material (CSAM) ...
Thousands of victims banded together for a proposal regarding a class action lawsuit against Apple, with the company now ...
Apple is now facing a $1.2 billion lawsuit over its decision to drop plans for scanning photos stored in iCloud for child ...
Thousands of CSAM victims are suing Apple for dropping plans to scan devices for the presence of child sexual abuse materials. In addition to facing more than $1.2B in penalties, the company could be ...
A second suit says Apple isn't doing enough to stop the spread of harmful images and videos and that it's revictimizing the ...
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as ...
Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for ...
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse ...