BrailleFly: VLM-Powered Navigation for the Blind

Created by team BrailleFly on November 30, 2024

BrailleFly revolutionizes accessibility with AI-powered smart glasses designed for blind and visually impaired individuals. While the core product ensures safety with offline capabilities like obstacle detection and facial recognition, we are testing advanced Vision Language Model (VLM) functionality during this hackathon. Leveraging a VLM (e.g., Llama 3.2), our glasses provide real-time obstacle recognition and enhanced navigation features, transforming the way users interact with their surroundings. This testbed explores the VLM's potential to deliver scene understanding and seamless waypoint navigation via intuitive gestures and voice commands. By balancing offline safety functions with online features, BrailleFly creates an inclusive and adaptive experience for users worldwide.
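As an illustration of the online scene-description path (a minimal sketch, not the team's actual implementation), the snippet below sends a single camera frame to an OpenAI-compatible endpoint serving a Llama 3.2 Vision model and asks for a one-sentence description that could be read out via text-to-speech. The base URL, API key, model identifier, and prompt are placeholder assumptions.

# Minimal sketch of the online scene-description path, assuming an
# OpenAI-compatible endpoint serving a Llama 3.2 Vision model.
# The base URL, model name, and prompt are illustrative placeholders.
import base64
from openai import OpenAI

client = OpenAI(
    base_url="https://example-llm-provider.example/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",
)

def describe_scene(jpeg_path: str) -> str:
    """Send one camera frame and return a short spoken-style description."""
    with open(jpeg_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="meta-llama/Llama-3.2-11B-Vision-Instruct",  # assumed model id
        messages=[
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": (
                            "Describe obstacles and the clear walking path "
                            "in one short sentence for a blind pedestrian."
                        ),
                    },
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                    },
                ],
            }
        ],
        max_tokens=60,
    )
    return response.choices[0].message.content

# Example usage: pass the returned sentence to the glasses' text-to-speech module.
# print(describe_scene("frame.jpg"))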

Category tags:

"demo should either be a video of the functionality or a deployed webite"


Okti AIML API

Head of DevRel

"This is impact. I've given you top scores on everything even though you did not specify who is paying and how much. Roadmap needs re-visiting and you should aim higher... raise €20mln instead of €200k, but at least you thought about it so you are in the right direction"


"Assistance for blinds is a great direction for llama3.2 vision capabilities. Please keep pushing and continue exploring this area!"


Aleksei Naumov

Lead AI Product Engineer