MoodCapture: This AI-Powered Phone App Can Detect Depression From Facial Expressions

A team of researchers at New Hampshire’s Dartmouth College has developed an AI-powered phone application that could detect early signs of depression through facial expressions. 

AI-Powered MoodCapture

Called MoodCapture, the app uses a smartphone's front camera to capture users' facial expressions and surroundings during regular phone use, while an AI-based algorithm analyzes the images for indicators associated with depression.

In a study involving 177 individuals diagnosed with major depressive disorder, MoodCapture was reported to have reached an impressive 75% accuracy rate in identifying early symptoms of depression.

(Photo: Sammy-Sander from Pixabay)

Andrew Campbell, the study's corresponding author and Dartmouth's Albert Bradley 1915 Third Century Professor of Computer Science, emphasized the groundbreaking nature of the technology, highlighting its potential to reliably and non-intrusively predict mood changes in people diagnosed with major depression.

“This is the first time that natural ‘in-the-wild’ images have been used to predict depression. There’s been a movement for digital mental-health technology to ultimately come up with a tool that can predict mood in people diagnosed with major depression in a reliable and nonintrusive way,” Campbell said in a press statement.

Campbell, whose phone recently logged more than 800 unlocks in a single week, added that a "person just unlocks their phone and MoodCapture knows their depression dynamics and can suggest they seek help."

How MoodCapture Works

MoodCapture uses a technology pipeline similar to that of facial recognition systems, combining deep learning with AI hardware, to analyze images captured by the phone's front camera in real time and spot early signs that a user may be becoming depressed.
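
The Dartmouth team has not published MoodCapture's source code, so the snippet below is only a hypothetical Python sketch of that inference step: a small image classifier (here a MobileNet backbone with a single-logit head, both assumptions) assigns a depression-risk score to one captured frame.

```python
# Hypothetical sketch of MoodCapture-style scoring, not the researchers'
# actual code: a trained image classifier rates one front-camera frame.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing; the real pipeline is unpublished.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A lightweight backbone with a single-logit head stands in for the model.
model = models.mobilenet_v3_small(weights=None)
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 1)
model.eval()

def score_frame(path: str) -> float:
    """Return a 0-1 depression-risk score for one captured frame."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return torch.sigmoid(model(x)).item()
```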

Over time, the app learns to identify the specific features unique to each user, providing personalized insights into their mental well-being. During the study, participants consented to having MoodCapture photograph them through their phones' front cameras, but they did not know when the images were being captured.

The application's predictive model was trained with image-analysis AI, learning to correlate participants' self-reports of feeling depressed with specific facial expressions and environmental features.
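
The study's exact architecture and loss are not described here, but an assumed, minimal version of that supervised step could pair each captured frame with the participant's binary self-report, as in this sketch:

```python
# Hypothetical training loop pairing frames with self-reported labels;
# the study's actual model and training setup are not public.
import torch
from torch.utils.data import DataLoader

def train(model, dataset, epochs: int = 5):
    """dataset yields (frame_tensor, self_report) with self_report in {0, 1}."""
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = torch.nn.BCEWithLogitsLoss()
    model.train()
    for _ in range(epochs):
        for frames, self_reports in loader:
            optimizer.zero_grad()
            logits = model(frames).squeeze(1)   # one logit per frame
            loss = loss_fn(logits, self_reports.float())
            loss.backward()
            optimizer.step()
```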

The research team anticipates that MoodCapture could become a powerful tool for evaluating individuals’ moods passively, offering valuable insights for therapeutic intervention.

Proof of Concept

Nicholas Jacobson, the study’s co-author and assistant professor of biomedical data science and psychiatry at Dartmouth’s Center for Technology and Behavioral Health, emphasized the importance of capturing the dynamic nature of depression symptoms in real time to enable timely intervention and treatment. 

Jacobson envisioned MoodCapture as a potential solution to bridge the gap between individuals needing mental health support and the accessibility of resources. 

By providing real-time insights into users’ emotional states, the app aims to offer support and minimize the impact of depression on individuals’ lives. Looking ahead, the researchers plan to refine MoodCapture’s diagnostic ability, enhance privacy measures, and expand its knowledge base to improve accuracy.

“You wouldn’t need to start from scratch; we know the general model is 75% accurate, so a specific person’s data could be used to fine-tune the model. Devices within the next few years should easily be able to handle this,” said Subigya Nepal, a Guarini School of Graduate and Advanced Studies PhD candidate in Campbell’s research group.

“We know that facial expressions are indicative of emotional state. Our study is a proof of concept that when it comes to using technology to evaluate mental health, they’re one of the most important signals we can get,” Nepal added.
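
Nepal's fine-tuning idea follows a familiar transfer-learning recipe. In a hedged sketch that reuses the hypothetical model defined above, personalization would freeze the general model's shared features and retrain only the final layer on one user's own labeled frames:

```python
# Hypothetical personalization step: adapt the general (~75%-accurate)
# model to a single user by retraining only its last layer.
import torch
from torch.utils.data import DataLoader

def personalize(general_model, user_data, epochs: int = 3):
    """user_data yields (frame_tensor, label) pairs from one person."""
    for p in general_model.parameters():
        p.requires_grad = False               # keep the general features
    head = general_model.classifier[-1]       # head from the sketch above
    for p in head.parameters():
        p.requires_grad = True                # adapt only the final layer

    opt = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = torch.nn.BCEWithLogitsLoss()
    loader = DataLoader(user_data, batch_size=16, shuffle=True)
    general_model.train()
    for _ in range(epochs):
        for frames, labels in loader:
            opt.zero_grad()
            loss = loss_fn(general_model(frames).squeeze(1), labels.float())
            loss.backward()
            opt.step()
```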

The research team’s findings were published on arXiv. 
