American smartphone giant Apple is facing a fresh lawsuit from the state of West Virginia alleging that the company has failed to curb child sexual abuse material on iOS devices and iCloud services, CNBC has reported.
The lawsuit was filed by Republican West Virginia Attorney General John “JB” McCuskey, who has accused Apple of prioritising its privacy branding and business interests while sidelining child safety.
McCuskey has maintained that other leading tech companies, including Google, Microsoft, and Dropbox, have been more proactive on this front, using systems like PhotoDNA to curb such material.
What Is PhotoDNA?
Developed by Microsoft and Dartmouth College in 2009, the system uses “hashing and matching” to automatically identify and block child sexual abuse material (CSAM) images that have already been identified and reported to the authorities.
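The hash-and-match idea can be sketched in a few lines. This is a hypothetical illustration, not PhotoDNA itself: the real system computes a proprietary perceptual hash that survives resizing and re-encoding, whereas this sketch uses an exact SHA-256 digest, and the hash database here is an assumed stand-in for the lists maintained by child-safety organisations.

```python
import hashlib

# Assumed database of hashes of previously identified and reported images.
# (This sample entry is simply the SHA-256 digest of the bytes b"foo".)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def file_hash(data: bytes) -> str:
    """Return the hex SHA-256 digest of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()


def is_known_match(data: bytes) -> bool:
    """Flag an upload if its hash matches a previously reported image."""
    return file_hash(data) in KNOWN_HASHES


print(is_known_match(b"foo"))        # True: digest is in the database
print(is_known_match(b"new image"))  # False: unknown content passes through
```

The key property is that the service only ever compares hashes against a list of known material; it does not need to interpret or classify new images, which is why hash matching catches only previously reported content.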
In 2021, Apple tested its own CSAM-detection feature that could find and remove such images, and report those uploaded to iCloud to the National Center for Missing & Exploited Children.
Privacy advocates raised concerns that the feature could open a back channel for government surveillance, and Apple subsequently withdrew its plan to implement the system.
Critics broadly argue that Apple's efforts since then have fallen short, leaving iCloud and iOS users vulnerable to such content.
This is not, however, the first time Apple has faced such allegations. In 2024, the UK-based watchdog National Society for the Prevention of Cruelty to Children said the company failed to adequately monitor, tabulate, and report CSAM in its products to the authorities.
Also in 2024, thousands of child sexual abuse survivors sued Apple in California's Northern District, alleging that the company should never have abandoned its earlier plans for CSAM-detection features and that its negligence had forced them to relive their trauma.
Apple has long positioned itself as privacy-focused, but if the present lawsuit succeeds, it could force the company to make design and security changes.
An Apple spokesperson told CNBC in an emailed statement that protecting the safety and privacy of its users, especially children, is central to what the company does.
The company pointed to parental controls and features like Communication Safety, which “automatically intervenes on kids' devices” when nudity is detected in messages, shared photos, AirDrop, and even live FaceTime calls.
“This is an indication of our commitment to provide safety, security, and privacy to the users,” the spokesperson said. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”