Meta Buried 'Causal' Evidence Of Social Media's Negative Impact On Mental Health: Report

Meta called off work and internally declared that the negative study findings were tainted by the “existing media narrative” around the company.

Earlier, Meta announced major changes to its content moderation practices. (Photo Source: Wikimedia Commons)

Meta allegedly shut down internal research after discovering causal evidence that Facebook and Instagram negatively impacted users’ mental health, according to unredacted court filings in a US class action lawsuit, global news agency Reuters reported.

The documents claim that a 2020 study, code-named “Project Mercury,” found that users reported lower depression, anxiety, and loneliness after deactivating the platforms — findings Meta chose not to publish, the report added. To the company’s disappointment, “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison,” the internal documents said.


Rather than publishing those findings or pursuing additional research, Meta called off further work and internally declared that the negative study findings were tainted by the “existing media narrative” around the company, the filing states, according to the report.

Earlier this year, Meta announced major changes to its content moderation practices, including an end to its fact-checking programme, starting with the United States. Meta’s platforms – which include Facebook, Instagram and Threads – will no longer employ human fact-checkers and moderation teams, relying instead on a user-sourced “community notes” model, similar to the approach X (formerly Twitter) currently uses.

In a statement on Saturday, Meta spokesman Andy Stone said the study was stopped because its methodology was flawed, and that the company had worked diligently to improve the safety of its products. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.


The Conversation reported that, after the company announced the changes, comments such as “gays are freaks” or “transgender people are mentally ill” became permissible — a shift that would create significant risks for users and for healthcare services sharing information online.

Earlier, in 2024, Mark Zuckerberg sought to avoid being held personally liable in two dozen lawsuits accusing Meta Platforms Inc. and other social media companies of addicting children to their products.
