
The death of Rachel Tussey, an Ohio mother of three who documented her cosmetic surgery journey on TikTok, has drawn renewed attention to the complicated and, at times, troubling intersection of social media and real-world harm.
Ms. Tussey, 47, known on TikTok as “midlifeunmuted,” died after suffering a permanent anoxic brain injury following a Feb. 25 procedure at JourneyLite Surgery Center in Evendale, Ohio. She remained on life support for roughly two weeks before dying at a hospice facility.
Her husband, Jeremy Tussey, said she became unresponsive after being administered pain medication. Her attorney has indicated that investigations will examine the actions of medical providers involved in her care.
While her death was not caused by a viral trend, it underscores a broader phenomenon: For many users, TikTok is not just a platform for sharing life events — it can shape behavior, amplify risk-taking and influence what content reaches vulnerable audiences.
The ‘Blackout Challenge’
Among the most widely reported cases tied to TikTok content is the so-called “Blackout Challenge,” which encouraged users to choke themselves until losing consciousness and record the results.
Ten-year-old Nylah Anderson of suburban Philadelphia died in December 2021 after attempting the challenge. Her mother, Tawainna Anderson, said the video appeared on her daughter’s “For You” page — TikTok’s algorithmically curated feed.
The family sued TikTok and its parent company, ByteDance, alleging the platform’s recommendation system pushed dangerous content to children. Although a federal judge initially dismissed the case under Section 230 of the Communications Decency Act, a U.S. appeals court revived it in 2024, allowing the claims to proceed.
Reports have linked roughly 20 child deaths worldwide to the challenge between 2021 and 2022, though no official comprehensive tally exists. In 2025, the parents of four British teenagers — Isaac Kenevan, 13; Archie Battersbee, 12; Julian Sweeney, 14; and Maia Walsh, 13 — filed a wrongful death lawsuit in the United States alleging TikTok’s algorithm promoted similar content to their children.
The ‘One Chip Challenge’
In September 2023, 14-year-old Harris Wolobah of Worcester, Massachusetts, died after eating an extremely spicy tortilla chip marketed as part of the viral “One Chip Challenge.”
The product, sold by Paqui, contained high concentrations of capsaicin derived from Carolina Reaper and Naga Viper peppers. The challenge — widely shared across social media, including TikTok — encouraged participants to eat the chip and endure the burning sensation without relief.
An autopsy found that Wolobah died of cardiopulmonary arrest in the context of ingesting a high-dose capsaicin product, and also noted an undiagnosed heart condition.
Following his death, Paqui pulled the product from shelves. Wolobah’s family later filed a wrongful death lawsuit against the manufacturer, its parent companies, and a retailer, alleging the product was marketed toward minors.
When content and identity blur
Not all incidents involve viral challenges.
In August 2021, Timothy Hall, 18, known on TikTok as “Timbo the Redneck,” died after the pickup truck he was spinning in a field flipped and crushed him.
Hall had built a following posting high-energy videos featuring trucks and stunts. After his death, his mother told followers he “loved TikTok” and was often motivated to create content for the platform.
While there is no evidence TikTok directly caused the accident, the case highlights how online personas and the pursuit of engagement can intersect with risky behavior.
International scrutiny
Concerns about harmful viral content are not limited to the United States.
In December 2024, Venezuela’s Supreme Tribunal of Justice fined TikTok $10 million after authorities linked three adolescent deaths to viral challenges involving the ingestion of chemical substances. The government said the funds would support victims and families affected by harmful online content.
A platform under pressure
TikTok says it prohibits dangerous challenges and self-harm content and works to remove such material proactively. The company also blocks or redirects searches related to known harmful trends.
Critics, however, argue that the platform’s core design, particularly its recommendation algorithm, can amplify risky content before moderation systems catch up, especially among younger users.
Courts are now increasingly being asked to decide where responsibility lies: with users, with content creators, or with the platforms that distribute and promote the material.
A complicated reality
The deaths linked in various ways to TikTok do not all share the same cause. Some involve alleged algorithmic promotion of dangerous challenges. Others reflect broader internet trends or the pressures of online performance.
What they collectively illustrate is a shifting landscape in which digital platforms do more than host content — they shape what people see, how they behave, and, in some cases, the risks they take.
For a generation living much of its life online, the line between content and consequence is becoming harder to draw.
This article was constructed with the assistance of artificial intelligence and published by a member of The Washington Times’ AI News Desk team. The contents of this report are based solely on The Washington Times’ original reporting, wire services, and/or other sources cited within the report. For more information, please read our AI policy or contact Steve Fink, Director of Artificial Intelligence, at sfink@washingtontimes.com.
The Washington Times AI Ethics Newsroom Committee can be reached at aispotlight@washingtontimes.com.