A newly unredacted court filing in the United States sheds fresh light on how Meta evaluated the impact of its platforms and why some of those findings never reached the public. The documents, part of a lawsuit brought by multiple school districts against Meta, Google, TikTok, and Snapchat, reveal that Meta obtained direct internal evidence that Instagram and Facebook negatively affect users’ mental health.

Inside “Project Mercury”

At the centre of the claims is Project Mercury, an internal study conducted roughly five years ago. According to the filing, Meta analysed what happened when users stepped away from its platforms for a week. The result: reported drops in depression, anxiety, loneliness, and the relentless comparison loop many users describe.

The plaintiffs argue that these results were so damaging that Meta shut the project down. Internally, executives reportedly attributed the decision to “a negative media narrative”, suggesting the data was tainted by public criticism, an explanation the lawsuit challenges.

Why It Matters: Potentially Misleading Congress

If the claims hold up, they directly contradict Meta’s previous testimony before the U.S. Congress. The company has repeatedly stated that it could not quantify the impact of its products on teenage mental health. The unredacted documents indicate that it could, and did.

The filing also outlines a number of internal shortcomings:

Child-safety systems were designed in ways that made them rarely activated.

Potentially harmful product features received limited testing.

Moderation tools sometimes acted only after severe or repeated violations; in one cited case, an account allegedly made 17 attempts to facilitate human trafficking before being removed.

Internal teams knew that harmful content increased teen engagement but continued surfacing it because it improved metrics.

One particularly stark allegation: in 2021, Mark Zuckerberg reportedly said that child safety was not his priority, as Meta’s top resources were directed toward building the metaverse.

Meta’s Response

Meta denies the accusations outright. Company spokesperson Andy Stone said the internal excerpts were “taken out of context” and that Project Mercury was discontinued due to flawed methodology, not inconvenient results. He also emphasized that teen safety is a core priority and that Meta’s anti-trafficking policy now mandates immediate removal of accounts following verified complaints.

What Happens Next

A hearing is scheduled for January 26, 2026, in federal court in Northern California. Meta has already asked the court to dismiss the internal documents from the case. If the judge allows them in, Meta will face significant questions: why its internal research reportedly conflicted with its public stance, and why product decisions continued even as evidence of harm accumulated.

This case could become one of the most consequential legal examinations of platform accountability in the social-media era.
