MUSCAT – A project titled ‘Smart Eye for Visually Impaired and Real-Time Sign Language Translator’ aims to provide affordable, accessible assistive technology for visually impaired and deaf individuals in Oman, supporting navigation and real-time Arabic sign language translation.
The research team, led by Mohamed Hamdy Eissa and funded by the Ministry’s Block Funding Program, has developed innovative AI-powered smart glasses and smart gloves to support visually impaired and deaf individuals, marking a significant step toward inclusive and affordable assistive technology in Oman.
The project, titled ‘Smart Eye for Visually Impaired and Real-Time Sign Language Translator’, aims to assist visually impaired individuals in navigation and recognition tasks using smart glasses, while enabling deaf individuals to communicate by translating Arabic sign language into text via smart gloves. It also implements a deep learning-based Arabic Sign Language Recognition system, providing a novel, portable, and affordable solution tailored to the needs of people with disabilities in Oman and the Arab world.
The system successfully recognised 32 Arabic sign language letters using a lightweight deep learning model (MobileNet-v2). Among these, the class ghain achieved the highest recognition performance with an F1-score of 0.948, while gaaf recorded the lowest at 0.535, highlighting the need for further dataset refinement. Overall, the model achieved a strong average F1-score of 0.808, demonstrating reliable classification performance. The use of transfer learning effectively addressed the challenge of limited dataset size, making the system suitable for real-world implementation.

The research team recommends expanding the system to recognise full words and sentences, integrating voice feedback and text-to-speech features, improving smart glove ergonomics, and collaborating with local disability associations to refine the technology based on user feedback. Future plans include exploring multilingual sign language support to extend the impact across the region. The project team, which includes Dr. Alaa Ismaeel, Dr. Sherimon P.C., Dr. Vinu Sherimon, and Dr. Remya Revi K., worked alongside Eissa to bring this groundbreaking assistive technology to life in Oman.