
Two months ago, my grandmother went in for what was supposed to be routine eye surgery. Her lens had shifted after a fall, and the doctors warned us that if the operation didn't go well, she could lose her vision. The surgery was successful and her sight came back, but during her recovery I kept asking myself: what if it hadn't worked? What if she had woken up into a world she couldn't see, while her children and grandchildren were all busy with work and studies, unable to guide her every time she needed help?

That fear made me think about the millions of blind and low-vision people who face similar challenges every day. White canes and human guides help, but they don't always explain what's around you, how the environment is changing, or where the safe path is in real time.

Out of that experience, our team built Smart-Pair: AI-powered smart glasses that connect to a real-time navigation backend and act like a calm, always-present companion. The glasses stream video to our system, where a vision-language model interprets the scene (doors, stairs, signs, obstacles) and generates structured guidance like "walk straight three meters, door on your right, chair blocking the left side." We store scene embeddings in a Qdrant vector database so Smart-Pair can recognize familiar locations such as "My kitchen" or "Clinic entrance" and say, "You're back near your study desk," instead of treating every corridor as new. Text-to-speech then delivers clear, natural instructions in the user's preferred language, while WebRTC keeps the video and guidance loop low-latency.

For me, this project began with one scary moment in my own family. As a team, we turned it into a mission: to build the kind of smart glasses that could have supported my grandmother if her surgery had gone differently, and that today can help blind and low-vision people move through the world with more confidence and dignity.

Demo: https://oaisis.streamlit.app/
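The familiar-location step above can be sketched in a few lines. This is a minimal, self-contained stand-in for the Qdrant lookup, assuming cosine similarity over scene embeddings and a similarity threshold for "known place"; the class name, threshold, and toy 4-dimensional vectors are illustrative, not our production values (real embeddings come from the vision model, and the store is a Qdrant collection rather than an in-memory list).

```python
import math


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


class SceneMemory:
    """Toy stand-in for a Qdrant collection of labeled scene embeddings."""

    def __init__(self, threshold=0.9):
        self.points = []  # list of (label, embedding) pairs
        self.threshold = threshold

    def upsert(self, label, embedding):
        """Remember a place the user has named, e.g. "My kitchen"."""
        self.points.append((label, embedding))

    def recognize(self, embedding):
        """Return the best-matching known place, or None if nothing is close enough."""
        if not self.points:
            return None
        label, score = max(
            ((lbl, cosine(embedding, emb)) for lbl, emb in self.points),
            key=lambda t: t[1],
        )
        return label if score >= self.threshold else None


# Usage: store two known places, then match a fresh frame's embedding.
memory = SceneMemory()
memory.upsert("My kitchen", [0.9, 0.1, 0.0, 0.1])
memory.upsert("Clinic entrance", [0.0, 0.8, 0.6, 0.0])

match = memory.recognize([0.85, 0.15, 0.05, 0.1])
if match:
    print(f"You're back near {match}.")  # feeds into the TTS prompt
```

In the actual system the nearest-neighbor search runs inside Qdrant rather than in Python, but the flow is the same: embed the current frame, query for the closest stored scene, and only announce a match when the similarity clears the threshold, so an unfamiliar corridor is never mislabeled as home.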
19 Nov 2025