Meta, Snapchat and TikTok are teaming up on a new initiative that’s designed to help detect and remove suicide and self-harm-related content, in order to reduce exposure among at-risk users.
The new initiative, called “Thrive”, will be overseen by the Mental Health Coalition, with the three platforms sharing data on concerning content, which will then enable broader cross-platform action.
As explained by Meta:
“Through Thrive, participating tech companies will be able to share signals about violating suicide or self-harm content so that other companies can investigate and take action if the same or similar content is being shared on their platforms. Meta is providing the technical infrastructure that underpins Thrive – the same technology we provide to the Tech Coalition’s Lantern program – which enables signals to be shared securely.”
To be clear, all three apps allow users to discuss mental health concerns and share their experiences. But each has definitive rules around the distribution of graphic imagery, and/or material that could encourage suicide or self-harm, which is the focus of the Thrive program.
The project will essentially see the three companies sharing data on such content, enabling broader, faster enforcement. The data, shared in the form of content “hashes” (digital fingerprints of the material itself), will allow the same or similar content to be uncovered across each app and addressed as required.
Meta notes that the shared data will only identify the content, and will not include identifiable information about any accounts or individuals. That will see such material removed faster, while also helping to build the respective databases and enforcement programs within each app.
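Thrive’s actual signal format hasn’t been published, but the general idea of hash-based matching can be sketched in a few lines. This is a minimal illustration using a plain SHA-256 digest; the hypothetical function and database names are assumptions, and production systems typically use perceptual hashes that also catch near-duplicate media.

```python
import hashlib

# Hypothetical shared database of hashes flagged by partner platforms.
# Only fingerprints are stored — no account or user information.
shared_hashes: set[str] = set()

def content_hash(data: bytes) -> str:
    """Compute a stable fingerprint of raw media bytes (SHA-256 for illustration)."""
    return hashlib.sha256(data).hexdigest()

def flag_violation(data: bytes) -> str:
    """Share a violating item's hash so other platforms can check for it."""
    h = content_hash(data)
    shared_hashes.add(h)
    return h

def matches_known_violation(data: bytes) -> bool:
    """Check newly uploaded content against hashes shared by partners."""
    return content_hash(data) in shared_hashes
```

An exact-match digest like this only catches byte-identical copies, which is why real deployments favor perceptual hashing to detect re-encoded or lightly edited versions of the same material.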
Major social networks have also been working together on influence operations, sharing similar info to detect and remove coordinated initiatives designed to deceive and mislead users.
This sort of cross-platform collaboration can greatly improve response efforts, and it’s good to see the three companies working to expand their respective efforts to better protect users.
Increased social media usage has been linked to higher rates of youth depression and self-harm. And with suicide now the second leading cause of death among American youth, it’s important that the platforms work to revise and improve their detection processes where possible, in order to keep users safe.
This is an important initiative in this respect, and will hopefully help to form a model for broader collaboration.