(CNN) – TikTok is rolling out new resources to support the well-being of its hundreds of millions of users, the majority of whom are teens and young adults.
The resources include in-app guides that address topics such as “signs of trouble,” “steps to connect,” and tips on eating and body image concerns, to help people who are dealing with mental health problems.
“We are proud that our platform has become a place where people can share their personal experiences with mental wellness, find community and support each other, and we take very seriously our responsibility to keep TikTok a safe space for these important conversations,” said Tara Wadhwa, TikTok’s director of US policy, in the Sept. 14 announcement.
TikTok has also further developed its search engine interventions. Search for words or phrases like “suicide” and you will find information on local support resources that offer guidance on treatment options. If you choose to view search results, you will generally see educational or supportive content about suicide, rather than potentially dangerous TikToks.
“It’s a great idea and pretty much everything a platform can do, because beyond that, we get into more extreme censorship of content and maybe even … monetizing things that are meant to be useful resources,” said Mike C. Parent, a psychologist and associate professor in the department of educational psychology at the University of Texas at Austin.
“It is important to talk about suicide and eating disorders and not delete that content,” he added. Some people might think that talking about suicide will make teens more inclined to try, Parent said, but that’s not always the case.
The TikTok announcement comes in the wake of a Wall Street Journal (WSJ) story alleging that Facebook publicly downplayed the effects of Instagram on adolescent mental health, although Facebook’s own investigation reportedly revealed serious negative impacts.
“While the (WSJ) story focuses on a limited set of findings and casts them in a negative light, we support this research,” Karina Newton, Instagram’s head of public policy, said in a statement. “It shows our commitment to understanding the complex and difficult problems that young people can struggle with, and it informs all the work we do to help those who experience these problems. The question on many people’s minds is whether social media is good or bad for people. The research on this is mixed; it can be both.”
Instagram has had tools, such as a “Get Support” message directing users to helplines and other tips, aimed at helping people struggling with mental health issues for some time, according to a company spokesperson.
Pros and cons of changes
Chicago-based psychologist John Duffy said by email that many of his young clients say they initially learn about depression, anxiety, attention problems and eating disorders on TikTok.
“I’ve seen some of these videos, some from other children and some posted by professionals, and many of them are accurate, informative and quite helpful,” said Duffy, who works with teens, parents, couples and families and wrote “Parenting the New Teen in the Age of Anxiety.” “I’m glad that good mental health-related information is available to our young people on a platform that engages them.”
“That said, these changes are not enough,” Duffy added. Teens often rely on TikTok content to diagnose and treat themselves, which can be dangerous without adult supervision. “It is critical that TikTok make clear that its platform is not a substitute for direct mental health care.” At the bottom of their “Wellness Guide”, TikTok states that the guides are for informational and educational purposes only and are not “intended to provide medical or mental health services.”
Additionally, the guides, which were created with the help of expert organizations such as Crisis Text Line, the International Association for Suicide Prevention, Live For Tomorrow, Samaritans of Singapore, Samaritans (UK) and the National Eating Disorders Association, live in TikTok’s Safety Center.
To access them, users must go to their profile, tap the menu icon in the upper right corner, and then scroll down to the “Support” section, where they can tap “Safety Center.”
“The general principle in user experience is that everything important should be one click or less away from the user,” Parent said. “Putting it several clicks away creates a barrier.”
TikTok already had warning labels and opt-out screens on videos with sensitive or distressing content. The company is expanding on that by applying the warnings to search results as well, for phrases like “scary makeup.”
Offering trigger warnings for sensitive material is courteous in certain settings, but research has shown that these warnings can be a double-edged sword: people can sometimes increasingly incorporate the trauma into their identity rather than view the triggering content as something to process in order to become healthier, Parent said.
“It can end up in what, in psychology, we would call safety behavior, which sounds nice, but it’s not really healthy,” Parent said. “It’s like a person who is afraid of spiders: if you never show them a picture of a spider, they will never stop being afraid of spiders.”
Tips for parents and teens
It’s important to have open communication with teens before problems arise, Parent said. If parents suspect that their teens’ use of social media is damaging their mental health, they should not punish them, he added; instead, they should engage a mental health professional who can mediate those conversations in ways that may better resonate with young people.
Also, remember that problematic social media use related to issues such as body image may be more of a symptom than the cause of the adolescent’s problem, according to Parent.
“Body image concerns existed long before social media. And now we live in an era where you can go to social media and find images of people as role models that you’ve never seen before,” Parent said. “Before, they were a bunch of white people and maybe Tyra Banks, and they were all skinny and had a particular look.”
Parents should become familiar with social media platforms, especially mental health related items, and talk openly with their teens about what they are witnessing, Duffy said.
For teens, being aware of common social media deceptions, such as cosmetic filters and edits, and adjusting your social media consumption can help.
If you live in the US and are having suicidal thoughts, you can call 1-800-273-8255 to reach the National Suicide Prevention Lifeline, which provides free and confidential support 24 hours a day, seven days a week, for people in crisis or suicidal distress. You can learn more about its services here, including its guide on what to do if you see suicidal language on social media. You can also call 1-800-273-8255 to speak with someone about how to help a person in crisis. For crisis assistance in Spanish, call 1-888-628-9454.
If you live outside of the US, the International Association for Suicide Prevention provides a global directory of international resources and hotlines. You can also turn to Befrienders Worldwide.