Meta, Snap, TikTok team up to remove self-harm content
- Meta, Snap, and TikTok have launched an initiative called Thrive to combat online content related to suicide and self-harm.
- The initiative will involve sharing signals about harmful content and utilizing technology to ensure safer online experiences for minors.
- This collaboration aims to address the rising mental health issues among young users and improve the response to harmful content across platforms.
On September 12, Meta announced a collaboration with Snap and TikTok to tackle online content related to suicide and self-harm. The initiative, named Thrive, aims to destigmatize mental health issues and slow the rapid spread of harmful content across social media platforms. The partnership is supported by The Mental Health Coalition, a group of mental health organizations focused on reducing stigma around these issues. Meta's global head of safety, Antigone Davis, emphasized the importance of addressing this content because of its potential to spread quickly across platforms.

Thrive will use technology developed by Lantern, a company dedicated to creating safer online environments for minors. The participating companies will share signals about harmful content, allowing each to monitor and act on similar posts on its own platform. The initiative will not target users; it focuses on identifying and removing harmful content. When such content is detected, it is assigned a unique identifier, or "hash," which other platforms can use to locate and remove matching material.

The collaboration comes in response to rising concerns about social media's impact on the mental health of young users, as increased usage has been linked to higher rates of depression and self-harm among minors. Meta has previously announced plans to limit sensitive content for teenagers and hide search results related to suicide and self-harm. Thrive represents a significant step toward a safer online environment for young users and a response to ongoing criticism of how social media platforms handle harmful content.
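The hash-sharing mechanism described above can be illustrated with a minimal sketch. Everything here is hypothetical: Thrive's actual system (built on Lantern's technology) is not public, and production systems typically use perceptual hashes for images and video rather than the exact SHA-256 fingerprint used below. The `SharedSignalRegistry` class and its methods are invented for illustration.

```python
import hashlib

def content_hash(content: bytes) -> str:
    """Fingerprint flagged content so platforms can share a signal
    without sharing the content itself. (Illustrative: real systems
    likely use perceptual hashing to catch near-duplicates.)"""
    return hashlib.sha256(content).hexdigest()

class SharedSignalRegistry:
    """Hypothetical shared registry of hashes of violating content."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def flag(self, content: bytes) -> str:
        """One platform flags content; its hash joins the shared set."""
        h = content_hash(content)
        self._hashes.add(h)
        return h

    def is_flagged(self, content: bytes) -> bool:
        """Another platform checks an upload against the shared hashes."""
        return content_hash(content) in self._hashes

# One platform flags a post...
registry = SharedSignalRegistry()
registry.flag(b"example harmful post")

# ...and a second platform can match the identical content by hash alone.
print(registry.is_flagged(b"example harmful post"))  # True
print(registry.is_flagged(b"an unrelated post"))     # False
```

The key property this sketch shows is that only the hash crosses platform boundaries, never the underlying content or any user identity, which matches the article's point that the initiative targets content rather than users.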