
A new study published in Information Systems Research found that certain short-form videos on major platforms can trigger suicidal thoughts in vulnerable viewers, and that a newly developed AI model can flag these high-risk videos before they go viral. The study offers one of the first data-driven, medically informed tools to detect suicide-related harm in real time, giving platforms a clear early warning signal amid growing concern about youth mental health and increased scrutiny of platform safety.
The research was conducted by Jiaheng Xie from the University of Delaware, Yidong Chai from Hefei University of Science and Technology and City University of Hong Kong, Ruicheng Liang from Anhui University of Finance and Economics, Yang Liu from Hefei University of Science and Technology, and Daniel Dajun Zeng from the Chinese Academy of Sciences.
Their work comes as short-form video use grows rapidly: some 1.6 billion people around the world watch short clips on TikTok, Douyin, and similar platforms, and experts warn about content that glamorizes or normalizes self-harm. Viewers often express their emotional distress directly in the comment sections of these videos, giving platforms a real-time signal of harm.
“Our goal was to help platforms understand when a video might trigger suicidal thoughts and catch that risk before it spreads,” Xie said. “The comments people leave are a strong indicator of how video content affects them, especially when viewers feel more willing to share how they’re feeling anonymously.”
The researchers developed a knowledge-guided neural topic model, a type of artificial intelligence that combines medical expertise about suicide risk factors with patterns found in real video content. The model predicts the likelihood that a new video will generate suicidal comments and allows the moderation team to intervene before the video reaches a wider audience.
Unlike existing methods that treat all videos and comments the same, this model distinguishes between what creators choose to post and what viewers think and feel after watching. It also separates known medical risk factors from emerging social media trends, such as viral breakup videos and challenges that can impact teens.
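The paper's full model is a knowledge-guided neural topic model; as a very loose illustration of one idea the article describes, the sketch below keeps medically established risk-factor signals separate from emerging platform-trend signals when scoring a video's comments. Every term list, weight, and function name here is invented for demonstration and is not from the study.

```python
# Illustrative toy only: the researchers' actual model is a neural topic model
# trained on real video and comment data. This sketch merely demonstrates the
# idea of weighting two separate knowledge sources (medical risk factors vs.
# emerging trends) when scoring viewer comments. All terms and weights are
# hypothetical.

# Hypothetical lexicon of medically established suicide risk-factor terms.
MEDICAL_RISK_TERMS = {"hopeless", "worthless", "self-harm", "suicidal"}
# Hypothetical lexicon of emerging social-media trend signals.
TREND_RISK_TERMS = {"breakup", "challenge"}

def risk_score(comments, medical_weight=2.0, trend_weight=1.0):
    """Return a crude average per-comment risk score for a video.

    Matches against the medical lexicon are weighted more heavily than
    trend matches, mirroring the separation of the two knowledge
    sources; the specific weights are arbitrary illustrations.
    """
    score = 0.0
    for comment in comments:
        words = set(comment.lower().split())
        score += medical_weight * len(words & MEDICAL_RISK_TERMS)
        score += trend_weight * len(words & TREND_RISK_TERMS)
    return score / max(len(comments), 1)

comments = ["i feel hopeless after this breakup", "great video"]
print(risk_score(comments))  # one medical hit (2.0) + one trend hit (1.0) over 2 comments -> 1.5
```

In a deployment like the one the article envisions, a score above a moderation threshold would route the video to human reviewers rather than trigger automatic removal, consistent with the authors' emphasis on supporting, not replacing, human judgment.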
“Short-form videos often include a mix of personal stories, emotionally moving visuals, and strong themes,” said Chai. “By bringing medical knowledge directly into our AI models, we can more reliably detect harmful content and surface it to human moderators when it matters most.”
The model outperformed other state-of-the-art tools at uncovering medically relevant themes in videos linked to expressions of suicidal ideation. For platforms, this means automated systems can more accurately flag videos for follow-up by human reviewers, improving consistency and reducing the volume of content that must be manually evaluated.
The authors note that the model is designed to support, not replace, human judgment. They emphasize that moderation teams must continue to make final decisions based on platform policies, legal standards, and ethical considerations.
The findings provide practical guidance for platforms facing increased scrutiny over harm to the safety and mental health of teenagers. Against a backdrop of lawsuits, regulatory pressure and growing public concern, researchers say tools like theirs could help reduce preventable tragedies.
Further information: Jiaheng Xie et al., Short-form videos and mental health: A knowledge-guided neural topic model, Information Systems Research (2025). DOI: 10.1287/isre.2024.1071
Provided by Operations Research Management Science Institute
Citation: AI model predicts which short videos on major platforms are likely to cause suicidal thoughts (November 17, 2025) Retrieved November 18, 2025 from https://techxplore.com/news/2025-11-ai-short-videos-major-platforms.html
