TikTok expands mental health resources, as negative reports of Instagram’s effect on teens leak

TikTok announced this morning that it is implementing new tactics to educate its users about the negative mental health impacts of social media. As part of these changes, TikTok is rolling out a “well-being guide” in its Safety Center, a brief primer on eating disorders, expanded search interventions, and opt-in viewing screens on potentially triggering searches.

Developed in collaboration with the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore, and Samaritans (UK), the new well-being guide offers advice targeted at people using TikTok, encouraging users to consider how sharing their mental health stories might affect them on a platform where any post has the potential to go viral. TikTok wants users to think about why they're sharing their experience, whether they're ready for a wider audience to hear their story, whether sharing could be harmful to them, and whether they're prepared to hear others' stories in response.

The platform also added a brief, albeit generic, memo about the impact of eating disorders under the "topics" section of the Safety Center, which was developed with the National Eating Disorders Association (NEDA). NEDA has a long track record of collaborating with social media platforms, most recently working with Pinterest to prohibit ads promoting weight loss.

Already, TikTok directs users to local resources when they search for words or phrases like #suicide,* but now the platform will also share content from creators intended to help someone in need. TikTok told TechCrunch that it chose this content in consultation with independent experts. Additionally, if someone enters a search phrase that might be alarming (TikTok offered "scary makeup" as an example), the search results will be blurred, and users must opt in to view them.
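To make the flow concrete, here is a minimal sketch in Python of how a search intervention like this could be wired up: crisis terms surface support resources, while terms flagged as potentially disturbing return blurred results behind an opt-in prompt. The term lists, the SearchResponse type, and handle_search are hypothetical illustrations for this article, not TikTok's actual implementation.

```python
# Hypothetical sketch of a search-intervention flow. All names and term
# lists here are illustrative; they are not TikTok's real API or data.
from dataclasses import dataclass

CRISIS_TERMS = {"suicide", "self harm"}   # route to local support resources
SENSITIVE_TERMS = {"scary makeup"}        # blur results until user opts in

@dataclass
class SearchResponse:
    results: list            # the raw search results
    show_resources: bool = False   # surface hotline / expert content first
    blurred: bool = False          # require an explicit opt-in to view

def handle_search(query: str, results: list) -> SearchResponse:
    normalized = query.lstrip("#").lower()
    if normalized in CRISIS_TERMS:
        return SearchResponse(results, show_resources=True)
    if normalized in SENSITIVE_TERMS:
        return SearchResponse(results, blurred=True)
    return SearchResponse(results)

# A crisis search surfaces resources; a flagged one comes back blurred.
print(handle_search("#suicide", ["video1"]).show_resources)  # True
print(handle_search("scary makeup", ["video2"]).blurred)     # True
```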

As TikTok unveils these changes, its competitor Instagram is facing scrutiny after The Wall Street Journal published leaked documents revealing parent company Facebook's own research on the harm Instagram poses to teen girls. Like the Gen Z-dominated TikTok, Instagram skews young: more than 40% of its users are 22 or younger, and 22 million teens log into Instagram in the U.S. each day. In one anecdote, a 19-year-old interviewed by The Wall Street Journal said that after she searched Instagram for workout ideas, her Explore page was flooded with photos about how to lose weight (Instagram has previously fessed up to errors with its search function, which recommended that users search topics like "fasting" and "appetite suppressants"). Angela Guarda, director of the eating disorders program at Johns Hopkins Hospital, told The Wall Street Journal that her patients often say they learned about dangerous weight-loss tactics on social media.

“The question on many people’s minds is if social media is good or bad for people. The research on this is mixed; it can be both,” Instagram wrote in a blog post today.

As TikTok's advice on sharing mental health stories acknowledges, social media can often be a positive resource, allowing people who are dealing with certain challenges to learn from others who have gone through similar experiences. So, despite these platforms' outsized influence, it's also on real people to think twice about what they post and how it might affect others. Even when Facebook experimented with hiding the number of "likes" on Instagram, employees said it didn't improve overall user well-being. These revelations about social media's negative impact on mental health and body image aren't groundbreaking, but they put renewed pressure on these powerful platforms to think about how to support their users (or, at the very least, to add some new memos to their Safety Centers).

*If you or someone you know is struggling with depression or has had thoughts of self-harm or suicide, the National Suicide Prevention Lifeline (1-800-273-8255) provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations.
