A team from the University of South Florida has created an AI tool that analyzes children’s facial expressions to help spot signs of post-traumatic stress disorder (PTSD) while keeping their identities private.
This study is one of the first to combine emotional behavior analysis with strict privacy protection. The system avoids storing personal identity details and focuses only on facial movements like eye direction, mouth position, head tilt, and expressions.
Why Diagnosing PTSD in Children Is Challenging
PTSD is a mental health condition that can develop after a person experiences or witnesses a deeply distressing or traumatic event. While adults can often describe their feelings, children may struggle due to:
• Limited communication skills
• Low emotional awareness
• Difficulty understanding or expressing feelings
Traditionally, PTSD in children is diagnosed through interviews, questionnaires, or clinical discussions, but many cases go unnoticed because children cannot always articulate what they feel.
How the New AI System Works
The idea began when Dr. Alison Salloum, a social work professor at USF and expert in child trauma, observed intense emotional reactions during trauma-focused interviews. She partnered with Dr. Shaun Canavan, an AI researcher, to find a better way to track those reactions without making children uncomfortable.
Dr. Canavan built an AI model trained on over 180,000 video frames per child, taken from therapy sessions. The system focuses on micro-expressions and subtle facial cues that may indicate stress or trauma—like changes in eye movement, facial muscle tension, or gaze patterns.
Importantly, the system was designed to blur or ignore identity details to ensure complete confidentiality during analysis.
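The researchers have not published their code, but the kind of de-identified analysis described above can be illustrated with a short sketch. The example below is a minimal assumption-laden stand-in, not the team’s actual pipeline: it uses OpenCV and MediaPipe Face Mesh to reduce each video frame to a few geometry-only numbers (mouth opening, eye opening, head tilt) and then discards the pixels, so no identifying imagery is retained. The landmark indices and feature definitions are illustrative choices.

```python
# Illustrative sketch only: extract de-identified facial geometry per frame.
# OpenCV + MediaPipe are assumed stand-ins, not the USF team's actual tools.
import math

import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh


def deidentified_features(video_path):
    """Yield one small feature dict per frame; raw pixels are never stored."""
    cap = cv2.VideoCapture(video_path)
    with mp_face_mesh.FaceMesh(static_image_mode=False,
                               max_num_faces=1,
                               refine_landmarks=True) as face_mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not results.multi_face_landmarks:
                continue
            lm = results.multi_face_landmarks[0].landmark

            # Landmark indices are illustrative picks from MediaPipe's face mesh:
            # 33 / 263 = outer eye corners, 13 / 14 = inner lips, 159 / 145 = eyelids.
            left_eye, right_eye = lm[33], lm[263]
            upper_lip, lower_lip = lm[13], lm[14]
            upper_lid, lower_lid = lm[159], lm[145]

            # Normalize by inter-ocular distance so features are size-invariant.
            iod = math.hypot(right_eye.x - left_eye.x, right_eye.y - left_eye.y)

            yield {
                "mouth_open": abs(lower_lip.y - upper_lip.y) / iod,
                "eye_open": abs(lower_lid.y - upper_lid.y) / iod,
                # Head tilt (roll) from the line joining the eye corners, in degrees.
                "head_tilt_deg": math.degrees(
                    math.atan2(right_eye.y - left_eye.y, right_eye.x - left_eye.x)),
            }
            # The frame itself goes out of scope here; only the numbers above persist.
    cap.release()
```

A downstream model would then learn from sequences of these numbers rather than from faces, which is one straightforward way to keep identity out of the stored data, in the spirit of the confidentiality-preserving design the researchers describe.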
A Support Tool for Mental Health Professionals
Dr. Salloum stressed that this AI tool is not meant to replace therapists or psychologists. Instead, it acts as a support system, giving professionals real-time feedback during sessions and helping them track a child’s emotional progress more accurately, without the need for repeated, emotionally intense interviews.
Ethical Approach and Future Use
Dr. Canavan emphasized that it’s important to follow ethical practices when doing research with children. He explained that the team took extra steps to protect the young participants and ensure the data collection was respectful and secure.
The researchers also found that children tend to show more facial responses when talking to clinicians than when speaking with their parents. This might be due to discomfort, embarrassment, or reluctance to share certain feelings with family members.
Currently, the team is working to remove biases related to gender, age, and culture from the AI system. This is especially important for use in younger children, like preschoolers.
In the future, the same AI technology could help detect other mental health conditions in children, such as:
• Anxiety
• Depression
• ADHD
Final Thoughts
This new AI tool offers a promising way to help professionals identify PTSD in children more quickly and sensitively, while also respecting their emotional well-being and privacy. It’s a big step forward in combining technology and mental health care, especially for young and vulnerable individuals.