Despite TikTok’s efforts, the algorithm can’t filter it all; some people find workarounds to access the videos, using Google searches, misspelled hashtags and more
by Julie Jargon
“I still see posts related to eating disorders on my feed at least three times a day,” says the 15-year-old high-school sophomore from Bellingham, Wash., who’s been struggling with unhealthy eating habits since middle school.
Nine months after a Wall Street Journal investigation showed that TikTok’s algorithms were flooding teens’ For You pages with videos encouraging weight loss and disordered eating, there are still plenty of them on the platform.
ByteDance Ltd.-owned TikTok said last December it would adjust its automated recommendations in general to avoid overly focusing on one type of content. Two months later, the company expanded its ban on eating-disorder videos, including those about overexercise and short-term fasting. TikTok has blocked searches for hashtags related to eating disorders from its search engine and is rolling out a tool allowing users to flag hashtags and block videos themselves.
Content creators—along with viewers—have nonetheless figured out ways to evade TikTok’s measures, according to data gathered by Within Health, an online provider of eating-disorder treatment.
The company’s founders, a doctor and an entrepreneur who both had struggled with eating disorders, studied TikTok because of its popularity with teen girls and to better understand its unique algorithm. Within Health executives say they’re concerned that the workarounds creators and viewers have discovered could become a bigger problem as TikTok’s competitors emulate its model of serving content chosen by an algorithm. Instagram, which now has TikTok-like short videos called Reels, has previously been found to host content that can harm teen girls. Instagram parent Meta Platforms Inc. has said many teen girls report that Instagram makes them feel better.
Policing a social-media platform with more than a billion monthly users is an enormous task, one made even harder by the difficulty in identifying all potentially harmful videos. The stakes for removing harmful content are especially high since so many of TikTok’s users are young. Two-thirds of U.S. teens are on TikTok, according to Pew Research Center.
When someone types #anorexic in the TikTok app or website, they’ll see a message directing them to the National Eating Disorders Association, or NEDA. But until recently, all someone had to do to find anorexia videos was go to Google and search “#anorexic TikTok.” When I checked in August, TikTok videos with the #anorexic tag had been viewed nearly 34 million times, up 17% since June, when Within Health began tracking the searches.
Since I brought this to TikTok’s attention, people can no longer get to videos with the #anorexic tag via Google; an error page now appears when you click on the link from the Google search.
There’s another workaround, though: misspellings of eating-disorder-related terms intended to slip through TikTok’s filter. Videos with the hashtag #anotexic are still reachable via both TikTok and Google, as are videos containing the hashtag #orthoreixa, a misspelling of orthorexia—an obsession with healthy eating that results in a potentially hazardous diet.
And some TikTok diet and exercise content, while fine for some, may be unhealthy for others, say eating-disorder experts. Those kinds of videos don’t necessarily violate TikTok’s community guidelines.
“Caring for a diverse, global community is a responsibility we take seriously. We’re mindful that triggering content is varied and unique to each individual,” Eric Han, TikTok’s head of safety for the U.S., said in a written statement. TikTok declined to make him available for an interview.
Naomi Sanders, a teen who has struggled with unhealthy eating habits, says she can’t avoid seeing dieting videos on TikTok.
Naomi no longer searches for content related to eating disorders, but says it turns up anyway. She says she blocks accounts from certain creators, reports those that promote disordered eating behavior and chooses “not interested” on categories of videos she no longer wants to see.
She says the videos that appear on her For You page don’t usually promote extreme weight-loss techniques, but contain more subtle messages about dieting and exercise.
One popular video theme across TikTok and other networks is “What I Eat in a Day.” These lists of foods often clock in well below the calorie count recommended by the Food and Drug Administration. (For a young adult woman, that’s 2,000 to 2,400 calories a day.) For someone advised by a doctor to lose weight, the videos might provide useful ideas. But it can be hard for teens to tell healthy messages from unhealthy ones.
“How can we expect our kids to sift through all of that and know what’s safe, reliable and medically sound?” says Rachel Fortune, consulting eating-disorder physician for Newport Healthcare, a network of mental-health treatment centers for teens and young adults. “TikTok is not where to find any of that.”
Other videos Naomi finds troubling are ones about recovering from an eating disorder.
Naomi says people often show photos of themselves at their thinnest and mention how much weight they lost. Naomi says those videos can spark competitiveness in teens like her—and sow doubts about how sick they themselves really were, since they didn’t lose as much weight as the people in the videos.
“Those are really hard for me to see,” she says. TikTok says it allows content on its site about eating-disorder recovery and support.
Within Health found that even partial searches in TikTok’s app can give people ideas for other diet-related terms, making it easier to find potentially harmful content. (Photo: Within Health)
‘No perfectly safe environment’
Lauren Smolar, vice president of mission and education at NEDA, says TikTok has helped users by filtering search terms and redirecting people to her organization. So far this year, she says, more than 5,000 people have contacted NEDA’s helpline because of the messages they had seen on TikTok.
Still, she says, social-media companies must keep evolving to block dangerous diet-related content. “There’s no perfectly safe environment at this point,” Ms. Smolar says.
Dr. Fortune says one of her patients put a lot of work into retraining her TikTok recommendations and repopulating her For You page. The young woman unfollowed diet-related accounts, then followed body-positive creators and liked their videos, indicating that she preferred such content.
“You have to make a conscious effort to run away from it and actively reject the content,” she says.
Naomi says the pervasiveness of diet culture makes it hard to avoid harmful eating content on social media—and to think rationally. “I’ll see a video of a very skinny person and it sets something off in my brain,” she says. “I get jealous.”
Naomi has been in inpatient treatment at a Newport Healthcare program twice for prescription-drug abuse and self-harm, and her restrictive dieting has been an underlying issue.
When she completed her last stay in May, she says, her mom didn’t allow her to be on social media. She says she felt disconnected from friends.
“Going completely off social media isn’t the answer for most people,” Naomi says.
After a few months of sticking to her recovery plan—which involved avoiding drugs, eating three meals a day and doing some form of daily physical activity—she convinced her mom to let her back on social media. Naomi says that this time she unfollowed everyone she didn’t talk to regularly, as well as celebrities she’d wanted to look like. She followed accounts related to self-help, art and poetry to spur more positive recommendations.
Her mom, Julea Ivancovich, says she wishes social media weren’t such a big part of teens’ social lives.
“I’m battling every other influence she has in her life. She knows I love her, but everywhere she’s seeing other sources telling her she could look better if she lost weight,” Ms. Ivancovich says. “I’ve watched social media take away her confidence and become something she gets lost in.”
Article originally published by the Wall Street Journal.