LLM-Glasses: GenAI-driven Glasses with Haptic Feedback for Navigation of Visually Impaired People

4 March 2025
Issatay Tokmurziyev
Miguel Altamirano Cabrera
Muhammad Haris Khan
Yara Mahmoud
Luis Moreno
Dzmitry Tsetserukou
Abstract

We present LLM-Glasses, a wearable navigation system designed to assist visually impaired individuals by combining haptic feedback, YOLO-World object detection, and GPT-4o-driven reasoning. The system delivers real-time tactile guidance via temple-mounted actuators, enabling intuitive and independent navigation. Three user studies were conducted to evaluate its effectiveness: (1) a haptic pattern recognition study achieving an 81.3% average recognition rate across 13 distinct patterns, (2) a VICON-based navigation study in which participants successfully followed predefined paths in open spaces, and (3) an LLM-guided video evaluation demonstrating 91.8% accuracy in open scenarios, 84.6% with static obstacles, and 81.5% with dynamic obstacles. These results confirm the system's reliability in controlled environments, with ongoing work focusing on refining its responsiveness and adaptability to diverse real-world scenarios. LLM-Glasses showcases the potential of combining generative AI with haptic interfaces to empower visually impaired individuals with intuitive and effective mobility solutions.
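As a rough sketch of the pipeline the abstract describes, the snippet below feeds open-vocabulary detections to GPT-4o and maps the model's one-word reply to a haptic pattern id. It assumes the public ultralytics (YOLO-World) and openai Python APIs; the class list, prompt wording, and four-pattern mapping are illustrative placeholders, not the authors' 13-pattern implementation.

import cv2
from ultralytics import YOLOWorld
from openai import OpenAI

# Open-vocabulary detector; the class list here is an assumed example set.
detector = YOLOWorld("yolov8s-world.pt")
detector.set_classes(["person", "chair", "door", "stairs", "pole"])
llm = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical subset of the paper's 13 haptic patterns.
HAPTIC_PATTERNS = {"left": 1, "right": 2, "stop": 3, "forward": 4}

def navigation_command(frame) -> int:
    """Detect obstacles in one frame and ask GPT-4o for a one-word steering command."""
    result = detector.predict(frame, verbose=False)[0]
    obstacles = [
        f"{result.names[int(box.cls)]} at normalized x-center {float(box.xywhn[0][0]):.2f}"
        for box in result.boxes
    ]
    reply = llm.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": "Obstacles ahead (x: 0=left, 1=right): "
                       + "; ".join(obstacles or ["none"])
                       + ". Reply with exactly one word: left, right, stop, or forward.",
        }],
    )
    word = reply.choices[0].message.content.strip().lower()
    # Fall back to "forward" if the reply is not a known command.
    return HAPTIC_PATTERNS.get(word, HAPTIC_PATTERNS["forward"])

frame = cv2.imread("street.jpg")  # placeholder camera frame
print("haptic pattern id:", navigation_command(frame))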

@article{tokmurziyev2025_2503.16475,
  title={LLM-Glasses: GenAI-driven Glasses with Haptic Feedback for Navigation of Visually Impaired People},
  author={Issatay Tokmurziyev and Miguel Altamirano Cabrera and Muhammad Haris Khan and Yara Mahmoud and Luis Moreno and Dzmitry Tsetserukou},
  journal={arXiv preprint arXiv:2503.16475},
  year={2025}
}