Court documents filed in a lawsuit against Meta allege that the company failed to warn anyone of the risks its platforms posed to young users and took a cavalier attitude toward adults preying on minors.
According to Time, the allegations add that Meta chose not to take steps that would have made its platforms safer for young users.
The brief containing the claims against Meta was filed in the Northern District of California. It alleged that Meta knew its products worsened teen mental health, and that content concerning eating disorders, suicide, and child sexual abuse was frequently detected but rarely taken down.
In the brief, Instagram’s former head of safety and well-being, Vaishnavi Jayakumar, who joined Meta in 2020, said the company had a lax policy toward “trafficking of humans for sex.”
According to the brief, Meta was aware that millions of adult strangers were contacting minors on its sites; that its products exacerbated mental health issues in teens; and that content related to eating disorders, suicide, and child sexual abuse was frequently detected, yet…
— TIME (@TIME) November 22, 2025
“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” Jayakumar said, adding, “by any measure across the industry, [it was] a very, very high strike threshold.”
“Meta never told parents, the public, or the Districts that it doesn’t delete accounts that have engaged over fifteen times in sex trafficking,” the plaintiffs wrote in the court filing.
“Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” Previn Warren, an attorney for the plaintiffs in the case, said.
“Meta employees proposed multiple ways to mitigate these harms, according to the brief, but were repeatedly blocked by executives who feared that new safety features would hamper teen engagement or user growth.” https://t.co/ydwnzMoSS0
— Dani Pinter (@DaniBPinter) November 22, 2025
“Like tobacco, this is a situation where there are dangerous products that were marketed to kids. They did it anyway, because more usage meant more profits for the company,” he said.
The lawsuit, which includes more than 1,800 plaintiffs, says the parent companies for Instagram, TikTok, Snapchat, and YouTube “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health.”
Although Time noted that Meta has made some changes, it said the brief includes testimony from Brian Boland, Meta’s former vice president of partnerships, who left in 2020 after 11 years.
“My feeling then and my feeling now is that they don’t meaningfully care about user safety,” he said. “It’s not something that they spend a lot of time on. It’s not something they think about. And I really think they don’t care.”
Meta did not respond to Time’s request for comment.
The lawsuit noted that five years before Instagram made teen accounts private by default to keep adults from contacting underage users, Meta had already weighed what to do about the problem. In 2020, the company determined it could lose 1.5 million monthly active teens per year, with one employee quoted as saying that taking action “is likely to lead to a potentially untenable problem with engagement and growth.”
Plaintiffs in the national Meta teen social media harm litigation filed a new brief 11/21/25 and Time magazine has published an extensive article on the harms alleged in the case. Here’s the link to the brief https://t.co/HvdsgfBjrG (1/2) pic.twitter.com/EtkA2YwoPn
— Lieff Cabraser (@LieffCabraser) November 22, 2025
“Meta knew that placing teens into a default-private setting would have eliminated 5.4 million unwanted interactions a day,” the lawsuit said.
The lawsuit added that although Meta’s artificial intelligence tools detected harmful content related to self-harm, child sexual abuse, and eating disorders, most of that content was never taken down.