New documentary, lawsuit target social media giants after Olympia teen’s fentanyl death
Aug 24, 2025, 5:00 AM
In this photo illustration, the logos of social media applications Instagram, Facebook, LinkedIn, Messenger, Hyperlapse, and Telegram are displayed on a cellphone screen. (Photo illustration: Buda Mendes, Getty Images)
For most residents in Thurston County, Dec. 19, 2024, was an ordinary Pacific Northwest winter day — chilly, mostly cloudy, with brief sun breaks in the afternoon.
But for the Ping family, that day remains forever dark. Their 16-year-old son, Avery Ping, died that Thursday evening from a drug overdose, according to doctors. The moments and images from that day continue to haunt them.
Home surveillance video footage appeared to show a drug transaction at the Pings' home, court records stated. Avery approached a Toyota Prius at the end of his driveway, where he interacted with someone inside. Later that evening, video footage showed Avery outside the home with two other men. He appeared visibly impaired, stumbling, swaying, and speaking incoherently to a home security camera tracking his movements. Emergency crews were later seen arriving. Avery was transported to the hospital.
Avery believed he had purchased MDMA, or ecstasy, according to Avery’s family. However, they said medical staff told them Avery actually had a lethal dose of fentanyl in his system.
Avery’s family also told investigators the suspected dealer used Snapchat to sell drugs to their son and multiple other Olympia High School students. And Avery’s story is far from isolated.
According to the Seattle-based Social Media Victims Law Center (SMVLC), similar tragedies unfold thousands of times each year. The group has filed more than 1,500 lawsuits against tech companies, including Snapchat, Meta (owner of Instagram and Facebook), and TikTok — accusing them of driving a youth mental health crisis through addictive algorithms and platform negligence. The consequences, they argue in court documents, include deadly drug overdoses, self-harm, and even suicide.
SMVLC’s work is also the focus of a documentary released in April, titled “Can’t Look Away: The Case Against Social Media,” which follows the group’s emotional fight on behalf of grieving families and its journey from Capitol Hill in Washington, D.C., to the courtroom in an effort to force tech companies to modify their algorithms.
“I think it resonates with every parent in America,” SMVLC attorney Matthew Bergman told KIRO Newsradio. “Anyone who has kids experiences the consequences of social media addiction, and some of the parents experience very horrific consequences.”
One of the lawsuits highlighted in the documentary is a landmark suit against Snapchat. During discovery, the pretrial phase in which both parties can seek information from each other, Bergman told KIRO Newsradio, SMVLC uncovered emails and documents that corroborate previous whistleblowers’ claims that Snapchat employs psychologists and neurologists and uses operant conditioning, a method that modifies a user’s behavior through rewards and punishments.
“They design platforms that are addictive to young people,” Bergman explained. “They know that their platforms are being used to facilitate drug deals, and they keep doing it.”
According to Pew Research, 95% of teens use social media, and a third said they are on it constantly. More than half of all teen girls report persistent sadness or hopelessness, according to the CDC.
The American Psychological Association and U.S. Surgeon General have both issued urgent warnings linking social media to rising rates of anxiety, depression, and suicidal ideation among youth.
In response, TikTok, Meta, and Snapchat have created safeguards for teenagers and parents to help them regulate the amount of time kids spend on their platforms. In response to this story, all three tech companies sent KIRO Newsradio statements and links to those safeguards, which we have shared, in full, below.
However, Bergman argued, while the companies claim to support moderation, they do very little to actually enforce it.
“They’re trying to, on one hand, say you shouldn’t use it too much, and on the other hand, provide products that are psychologically and neurologically addictive,” Bergman said. “If they just turned off the algorithms and showed kids what they want to see, not what they can’t look away from, the mental health crisis that we are seeing in our young people would not exist, and our kids would be healthier and safer, and we’d be on a much better trajectory.”
In January, the Olympia Police Department (OPD), along with a SWAT team, arrested a 33-year-old man for controlled substance homicide in the Ping case. While the case is still pending, court documents say it was “clear” the suspect’s Snapchat account was used for selling drugs.
In March, OPD sent a warning to families to beware of drug dealers targeting kids on social media.
“They put an emoji in the search title or put in ‘drug dealer’ in the search title, and the app will use their location and let them know dealers within their location,” Olympia Police detective Patrick Hutnik said. “They pay with Cash App and can have it delivered to their home.”
The Ping family is scheduled to take part in a panel discussion after a free screening of the documentary — which is based on reporting by Bloomberg Media — at West Seattle’s Admiral Theater on Aug. 28 at 6 p.m. Joining them will be SMVLC attorneys, policy experts, and a former Meta executive-turned-whistleblower. Here is a link for tickets to the screening.
Below are the statements and links sent to KIRO Newsradio from Meta, TikTok, and Snapchat.
Statement from Meta
“We know parents are worried about their teens having unsafe or inappropriate experiences online, and that’s why we’ve significantly changed the Instagram experience for tens of millions of teens with new Teen Accounts,” a Meta spokesperson told KIRO Newsradio. “These accounts provide teens with built-in protections to automatically limit who’s contacting them and the content they’re seeing, and teens under 16 need a parent’s permission to change those settings. We also give parents oversight over their teens’ use of Instagram, with ways to see who their teens are chatting with and block them from using the app for more than 15 minutes a day, or for certain periods of time, like during school or at night.”
Here is more information concerning teen accounts.
Here is more information, from April 2025, on other updates.
Here is more information on Meta’s Family Center.
Here is more information on the full timeline of updates, provided by Meta, to support parents of teens.
Here is more information on how Meta approaches verifying that users are the age they say they are.
Statement from TikTok
“While the average age of a US user on TikTok is over 30, we recognize that special safeguards are required to protect teens online, which is why TikTok offers an experience for teens that is much more restrictive than the experience for adults on our platform,” TikTok said in an official statement. “For example, teens under 16 are set to private by default and can’t access direct messaging, while all teens under 18 have a 60-minute screentime limit by default and can’t host a livestream or receive virtual gifts.
“Our Family Pairing tools allow parents to customize their teen’s TikTok experience to fit their unique needs,” the statement continued. “With access to over 15 safety, privacy, and well-being controls, parents can block app access during specific times, set personalized daily screen time limits, block push notifications, see who their teen follows and who follows them, and filter or restrict content in their teen’s feed — empowering families to support healthy digital habits in a way that works for them.”
TikTok added that any content promoting dangerous behavior that may lead to serious injury violates its community guidelines and would be removed from the platform.
“We also take rigorous steps to safeguard the For You feed for teens and our entire community,” TikTok stated. “We make some content ineligible for the For You feed, such as violence or graphic content that may be newsworthy. On top of this, through Content Levels, we prevent some content that may contain more mature or complex themes from reaching teens on TikTok.”
According to TikTok, in the first quarter of 2025, 99.8% of the videos removed for violating its dangerous challenges policy were removed proactively, 92.4% of those videos received no views, and 97.5% were removed within 24 hours.
“To further discourage such content from being posted or replicated, we do not show videos of known dangerous challenges in search results,” TikTok continued. “Instead, we direct searches and hashtags to an Online Challenges Safety Center page we created with guidance from youth safety experts, an adolescent development doctor, and a behavioral scientist specializing in risk prevention in adolescence.”
Statement from Snapchat
“Snap condemns the horrific criminal behavior by drug dealers that led to these tragedies, and we have deep empathy for the families who have endured heartbreaking losses,” a Snap spokesperson told KIRO Newsradio. “Most of Snapchat’s hundreds of millions of daily users use the platform as designed and intended – to engage with their friends and family spontaneously and authentically as they do in real life. We are committed to making Snapchat a safe and fun environment for our community, and have built privacy and safety features into our service from the start.”
Like all Snapchat accounts, teen accounts are private by default, Snap said. This means friend lists are private: only the individual can see their friend list, and Snapchat does not show their friends to anyone else, other than a parent or guardian connected via Family Center.
“We continue to improve and add to our Family Center suite of tools, released in 2022, where parents can see who their teen is friends with on Snapchat, who they have chatted with recently, and easily report accounts that may be of concern to them,” the Snap spokesperson continued. “We also recently released and promoted a refresh to our Family Safety Hub site. Snap will continue to focus on protecting our community by prioritizing longstanding partnerships, supporting law enforcement, and collaborating with online safety experts, parents, educators, and policymakers toward our shared goal of keeping people – and especially young people – safe online.”
Follow Luke Duecy on X. Read more of his stories here. Submit news tips here.