USF doctor enters final year of research to see if AI can detect diseases through voice

Three years into a project using artificial intelligence and voice to detect diseases and disorders, a University of South Florida laryngologist is ready for the next step in helping patients.

Voice is personal for Dr. Yael Bensoussan, who was a singer before joining the medical field and USF Health.

What they're saying:

"I had a disease of the vocal cord that’s called nodules, which I had to stop for a while and get treatment for. So, I’ve always been very passionate about the voice in general and what it can tell about our emotions, [and] our health," said Bensoussan, an assistant professor of otolaryngology at USF Health and co-principal investigator of the Bridge2AI Voice project.


The USF doctor is in the final year of a four-year program to determine whether AI can be used to detect diseases through a patient's voice.

Since 2022, Bensoussan has been studying how artificial intelligence can detect health conditions through voice.

"We work in the emergency department. It’s always very interesting to interview people in the emergency department. We work in outpatient clinics to build all those datasets," said Bensoussan.

Dig deeper:

The Bridge2AI Voice project is the focus of a three-day symposium, bringing together AI industry leaders and medical professionals in Tampa.

The symposium:

"A few months ago, we had our first data release. We learned a lot from what people were doing with that data. Now, we have our second data release," said Bensoussan. "Hopefully, in the next month, we’ll learn a lot about the feedback that people are giving us."

Big picture view:

Doctors, clinicians and researchers are using her data to build models that could help diagnose conditions such as Alzheimer’s disease, heart disease and even mental health disorders.

"Can they use it? Is it hard to make discoveries? Is it hard to use? Can somebody with not a lot of knowledge about AI use it? Or do you have to have a lot of knowledge about AI to use it?" asked Bensoussan.

The National Institutes of Health has funded the four-year project. For her final year, Bensoussan said she will transition from a hospital setting to visiting patients in their own homes across the U.S. and Canada.


"A lot of people with chronic diseases, unfortunately, have to go to the hospital multiple times a month, and they never know when it’s time for them to go to the doctor, right?" said Bensoussan.

What's next:

Once doctors confirm the technology works, she said, the next step is to seek FDA approval, turning the research into real-life solutions for patients' health.

"That’s really the goal, right? We’re all developing tech. If we can’t implement it, then it doesn’t benefit the patient," said Bensoussan.


Bensoussan said examples of those in-home solutions include smartwatches that, based on your voice, could detect when you’re about to fall or when it’s time to go to the emergency room.

USF Health and Weill Cornell Medicine worked to release the first clinically validated voice dataset from the NIH-funded "Voice as a Biomarker of Health" project. It featured over 12,500 recordings from 306 participants for use in health research.

The Source: FOX 13’s Briona Arradondo collected the information in this story.
