Researchers have developed an eye-tracking tool that they claim offers a new way to diagnose vestibular and neurological conditions.
The platform allows patients to record their eye movements using a smartphone and its developers say results can be easily interpreted by audiologists and other clinicians during virtual consultations.
Researchers from Florida Atlantic University (FAU) and collaborators developed a novel proof-of-concept deep learning model that leverages real-time data to assist in diagnosing nystagmus, a condition characterised by involuntary, rhythmic eye movements often linked to vestibular or neurological disorders.
They claim the system offers a cost-effective, patient-friendly, quick and reliable screening for balance disorders and abnormal eye movements.
Patients record their eye movements with a smartphone, securely upload the video to a cloud-based system, and receive remote diagnostic analysis from vestibular and balance experts – all without leaving their home.
At the heart of the innovation is a deep learning framework that uses real-time facial landmark tracking to analyse eye movements.
AI maps facial landmarks
An AI system automatically maps 468 facial landmarks and evaluates slow-phase velocity – a key metric for identifying nystagmus intensity, duration and direction. It then generates intuitive graphs and reports that can easily be interpreted by audiologists and other clinicians during virtual consultations.
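The study's code is not published, and the exact pipeline is not specified in the article; the 468-point mesh matches the landmark count produced by Google's MediaPipe Face Mesh, which is one plausible tracking front end. Setting the tracking aside, the slow-phase-velocity calculation itself can be sketched on a synthetic eye-position trace – the function name, saccade threshold and test signal below are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def slow_phase_velocity(position_deg, fs, saccade_thresh=100.0):
    """Estimate mean slow-phase velocity (deg/s) of a nystagmus trace.

    Samples whose instantaneous velocity exceeds `saccade_thresh`
    (deg/s) are treated as fast-phase beats and excluded, leaving
    only the slow drift that clinicians grade.
    """
    velocity = np.gradient(position_deg) * fs      # deg per second
    slow = np.abs(velocity) < saccade_thresh       # mask out saccades
    return float(np.mean(velocity[slow]))

# Synthetic right-beating nystagmus: 10 deg/s slow drift with a fast
# corrective reset every 0.4 s (a sawtooth), sampled at 250 Hz.
fs = 250
t = np.arange(0.0, 2.0, 1.0 / fs)
position = 10.0 * (t % 0.4)
spv = slow_phase_velocity(position, fs)   # close to 10 deg/s
```

On this idealised sawtooth the estimate recovers the 10 deg/s drift; real video-derived traces would need smoothing and calibration from pixels to degrees first.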
The researchers said results of a pilot study involving 20 participants, published in Cureus (part of Springer Nature), demonstrated that the AI system’s assessments closely mirrored those obtained through traditional medical devices.
“This early success underscores the model’s accuracy and potential for clinical reliability, even in its initial stages,” they said.
“Our AI model offers a promising tool that can partially supplement – or, in some cases, replace – conventional diagnostic methods, especially in telehealth environments where access to specialised care is limited,” said Dr Ali Danesh, principal investigator and senior author of the study.
“By integrating deep learning, cloud computing and telemedicine, we’re making diagnosis more flexible, affordable and accessible – particularly for low-income rural and remote communities.”
Dr Danesh is a professor in the Department of Communication Sciences and Disorders in FAU’s College of Education and a professor of biomedical science in its Charles E Schmidt College of Medicine.
His team trained their algorithm on more than 15,000 video frames, using a structured 70:20:10 split for training, testing and validation to ensure the model’s robustness and adaptability across varied patient populations.
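The article gives the ratio but not the tooling, so the partitioning step can only be sketched generically – a minimal version of a shuffled 70:20:10 split over frame indices, with the function name and seed chosen here for illustration:

```python
import random

def split_frames(frames, ratios=(0.70, 0.20, 0.10), seed=42):
    """Shuffle frame identifiers and partition them 70:20:10.

    Returns (train, test, validation) lists; the last split takes
    whatever remains so no frame is dropped by rounding.
    """
    frames = list(frames)
    random.Random(seed).shuffle(frames)   # reproducible shuffle
    n_train = int(len(frames) * ratios[0])
    n_test = int(len(frames) * ratios[1])
    return (frames[:n_train],
            frames[n_train:n_train + n_test],
            frames[n_train + n_test:])

train, test, val = split_frames(range(15000))
```

With 15,000 frames this yields 10,500 training, 3,000 testing and 1,500 validation frames. In practice a per-patient split (keeping all frames from one participant in the same partition) would be safer than a per-frame one, since adjacent frames from the same video are highly correlated.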
The AI also employs intelligent filtering to eliminate artefacts such as eye blinks, ensuring accurate and consistent readings.
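The article does not say how blinks are detected. One common landmark-based approach – assumed here, not confirmed as the authors' method – is the eye aspect ratio (EAR): the ratio of the eye's vertical to horizontal landmark distances collapses towards zero when the lid closes, so low-EAR frames can be discarded:

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio from six (x, y) landmarks ordered
    [outer corner, upper-1, upper-2, inner corner, lower-2, lower-1]."""
    d = math.dist
    vertical = d(eye[1], eye[5]) + d(eye[2], eye[4])
    horizontal = d(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def drop_blink_frames(frames, threshold=0.2):
    """Keep only frames whose EAR indicates an open eye."""
    return [f for f in frames if eye_aspect_ratio(f) >= threshold]

# Illustrative landmark sets: a wide-open eye and a nearly closed one.
open_eye = [(0, 0), (1, -1), (2, -1), (3, 0), (2, 1), (1, 1)]
closed_eye = [(0, 0), (1, -0.1), (2, -0.1), (3, 0), (2, 0.1), (1, 0.1)]
kept = drop_blink_frames([open_eye, closed_eye])   # closed frame removed
```

Filtering blink frames before computing slow-phase velocity matters because the eyelid occludes the landmarks mid-blink, producing spurious velocity spikes that would otherwise be mistaken for fast-phase beats.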
Streamline clinical workflows
Apart from diagnostics, the system is also designed to streamline clinical workflows.
Physicians and audiologists can access AI-generated reports via telehealth platforms, compare them with patients’ electronic health records, and develop personalised treatment plans. Patients benefit from reduced travel, lower costs and the convenience of conducting follow-up assessments by simply uploading new videos from home – enabling clinicians to track disorder progression over time.
FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real time. Early tests in controlled environments have shown promise, although they said improvements were still needed to address challenges such as sensor noise and variability among individual users.
“While still in its early stages, our technology holds the potential to transform care for patients with vestibular and neurological disorders,” said Dr Harshal Sanghvi, first author.
“With its ability to provide non-invasive, real-time analysis, our platform could be deployed widely – in clinics, emergency rooms, audiology centres and even at home.”
Dr Sanghvi is an FAU electrical engineering and computer science graduate, and a postdoctoral fellow at FAU’s College of Medicine and College of Business.