TikTok Adds New Resources to Provide Additional Pathways to Support for Mental Health Concerns
First off, the platform is expanding its well-being guides to support people who choose to share their personal experiences on the platform.
As explained by TikTok:
“While we don’t allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders, we do support people who choose to share their experiences to raise awareness, and help others who might be struggling and find support among our community. To better facilitate such, we’ve developed new well-being guides to support people who choose to share their personal experiences, with the guidance of the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore and Samaritans (UK).”
The new guides, now available in TikTok’s Safety Center, offer tips to help users communicate their experiences, as well as how to responsibly engage with others who may be struggling or in distress.
In addition to this, TikTok’s also highlighting a new set of curated content from its partner organizations in-app, providing more up-front information about important well-being issues.
The new programming is currently live, and will be running through September 16th.
TikTok’s also expanding its in-search prompts for eating disorder-related queries, which will help to guide users towards professional support tools and resources.
“We’ve added a new Safety Center guide on eating disorders for teens, caregivers, and educators. Developed in consultation with independent experts including the National Eating Disorders Association (NEDA), National Eating Disorder Information Centre, Butterfly Foundation, and Bodywhys, this guide will provide information, support and advice on eating disorders.”
TikTok’s also adding similar prompts to suicide and self-harm related searches, with links directing users to local support resources and options.
And finally, TikTok’s also updating its warning labels for sensitive content.
“Beginning in September, when a user searches for terms that may bring up content that some may find distressing, for example ‘scary make-up’, the search results page will be covered with an opt-in viewing screen. Individuals will be able to click ‘Show results’ to continue to see the content.”
These opt-in viewing screens already appear on top of videos that some may find graphic or distressing, while this type of content is also ineligible for recommendation into anyone’s For You feed.
TikTok has gained major traction among younger audiences over the last few years, and continues to add more users. With that growth comes an obligation for the platform to protect its more impressionable users where it can, both shielding them from harm and providing support.
TikTok has faced various challenges on this front. Last year, the app was temporarily banned in Italy following the death of a young girl participating in an in-app challenge, while TikTok has also been criticized for enabling the exploitation of young girls, with its highly attuned algorithms seemingly surfacing personal traits and content that may appeal to predators.
Much like Instagram, the visual nature of the platform can easily contribute to mental health impacts, and as such, TikTok needs to support users in need as best it can, with more resources, more in-app tools, and community connection.
There’s no way to address such issues in their entirety, but it’s good to see TikTok continuing to add new tools on this front.