AI Model Is 98 Percent Accurate in Detecting Diseases Just by Looking at Patients’ Tongues
An indigo or violet hue could signal vascular issues, gastrointestinal challenges, or asthma.
In a groundbreaking development, researchers have unveiled a computer algorithm that analyzes tongue color to diagnose medical conditions with a reported 98 percent success rate.
The study, led by Ali Al-Naji, an academic affiliated with both Middle Technical University in Baghdad and the University of South Australia, highlights the diagnostic potential of the AI-driven technology.
“For instance, diabetes patients often show a yellow tongue, while those battling cancer might present a purple tongue with a greasy layer. Acute stroke patients often have a distinctly shaped red tongue,” Mr. Al-Naji told the Hindustan Times.
The research also notes that a white tongue might indicate anemia, while severe Covid-19 cases typically display a deep-red tongue. An indigo or violet hue could signal vascular issues, gastrointestinal challenges, or asthma.
Drawing inspiration from traditional Chinese medicine, which uses tongue examination as a diagnostic tool, the researchers trained their AI model on more than 5,200 images. The system was then tested on 60 tongue images sourced from two medical institutions in the Middle East.
Participants were asked to sit approximately eight inches from a laptop equipped with a webcam that captured images of their tongues. The algorithm accurately identified the health conditions in nearly every case, according to the results documented in the journal Technologies.
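To make the color-analysis idea concrete, here is a minimal sketch of how a program might estimate the dominant color of a tongue photo. It is a toy illustration only, not the researchers' published system: the hue thresholds, the central-patch crop, the color labels, and the sample file name are assumptions chosen for demonstration, whereas the actual model was trained on thousands of labeled images rather than fixed rules.

```python
# Toy illustration of tongue-color estimation, NOT the published AI model.
# Assumptions: the tongue roughly fills the center of the photo, and the
# hue bands / color labels below are rough guesses for demonstration only.
import cv2
import numpy as np

# Rough hue bands in OpenCV's 0-179 hue scale, paired with color labels
# mentioned in the article. Real red also wraps around near 170-179.
COLOR_BANDS = [
    ((0, 10), "red"),
    ((20, 35), "yellow"),
    ((125, 155), "purple/indigo"),
]

def dominant_tongue_color(image_path: str) -> str:
    """Return a coarse color label for the center region of a tongue photo."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)

    # Convert to HSV so hue and saturation can be inspected separately.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    # Look only at the central patch, assuming the tongue sits in the middle.
    h, w = hsv.shape[:2]
    patch = hsv[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3]

    mean_hue = float(np.mean(patch[:, :, 0]))
    mean_sat = float(np.mean(patch[:, :, 1]))

    # Very low saturation means the patch is close to white or pale grey.
    if mean_sat < 40:
        return "white/pale"
    for (lo, hi), label in COLOR_BANDS:
        if lo <= mean_hue <= hi:
            return label
    return "unclassified"

if __name__ == "__main__":
    # "tongue.jpg" is a hypothetical sample image captured by a webcam.
    print(dominant_tongue_color("tongue.jpg"))
```

In the actual study, simple color cues like these would be only raw material: the trained model learned from thousands of labeled examples how tongue color relates to specific conditions, rather than relying on hand-picked thresholds.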
The technology could potentially be used to diagnose a wide array of conditions, including diabetes, stroke, anemia, asthma, liver and gallbladder issues, and Covid-19.