

AIYA is a multimodal AI assistant for smart glasses that delivers real-time intelligence, combining vision, language, and speech, directly in your field of view, hands-free.

The Solution

AIYA addresses these challenges through a suite of specialized AI agents embedded within smart glasses. Each agent is purpose-built for a specific use case, and together they operate as an orchestrated system:

- Vision Agent: analyzes live frames via GPT-4o/Gemini for scene understanding and object detection.
- Translation Agent: detects foreign text with OCR and overlays translations instantly on the lens.
- Chat Assistant: voice-driven AI guidance, hands-free, with no handheld device needed.
- Navigation Agent: real-time AR turn-by-turn directions overlaid in view.
- Safety & Hazard Agent: monitors for warning signs and hazardous zones, alerting the wearer immediately.

Technology Stack

AIYA is built on a modular architecture: the webcam/glasses feed is captured via WebRTC, frames are analyzed by GPT-4o Vision or Gemini, OCR is powered by Tesseract.js, gesture and object detection run on MediaPipe, avatar-driven responses come from D-ID, and AR overlays are rendered on-device. The system is designed to be lightweight, low-latency, and deployable on Epson Moverio or Ray-Ban Meta smart glasses.

Why AIYA Wins

AIYA is not a single-purpose tool; it is an extensible AI agent platform for the physical world. It reduces language barriers, improves safety outcomes, accelerates workforce productivity, and delivers personalized, context-aware intelligence exactly when and where it is needed. The future of AI is not on a screen; it is in your field of view.
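The orchestrated-agent design described above can be sketched as a simple intent router that dispatches each event from the glasses to the matching purpose-built agent. This is a minimal illustration, not AIYA's actual implementation: the intent names, event shape, and agent stubs are assumptions, and the stubs stand in for the real GPT-4o/Gemini, OCR, and navigation calls.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Hypothetical event coming off the glasses: an intent plus a payload
# (a frame reference, OCR'd text, or a transcribed voice command).
@dataclass
class GlassesEvent:
    intent: str
    payload: str

# Each agent is a callable turning an event into an overlay string for the lens.
# In production these would call GPT-4o/Gemini, Tesseract.js OCR, etc.
def vision_agent(e: GlassesEvent) -> str:
    return f"[scene] {e.payload}"

def translation_agent(e: GlassesEvent) -> str:
    return f"[translated] {e.payload}"

def navigation_agent(e: GlassesEvent) -> str:
    return f"[turn-by-turn] {e.payload}"

def hazard_agent(e: GlassesEvent) -> str:
    return f"[ALERT] {e.payload}"

class Orchestrator:
    """Routes each event to the purpose-built agent for its intent."""

    def __init__(self) -> None:
        self.agents: Dict[str, Callable[[GlassesEvent], str]] = {
            "describe": vision_agent,
            "translate": translation_agent,
            "navigate": navigation_agent,
            "hazard": hazard_agent,
        }

    def dispatch(self, event: GlassesEvent) -> str:
        agent = self.agents.get(event.intent)
        if agent is None:
            # Unrecognized intents fall back to the voice chat assistant.
            return "[chat] " + event.payload
        return agent(event)
```

A voice command like "translate this sign" would then flow as `Orchestrator().dispatch(GlassesEvent("translate", ...))`, and new agents can be added by registering another entry in the routing table, which is what makes the platform extensible.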
2 Mar 2026

CareSight AI is an autonomous, multimodal healthcare assistant designed to reduce clinical paperwork and improve care readiness across the patient journey. The system uses computer vision, OCR, speech recognition, and LLM reasoning (Gemma 3N) to perceive and structure unorganized healthcare data such as documents, images, and spoken responses.

Patients begin with a mobile web app to upload IDs, insurance cards, lab reports, and clinical images, or to provide voice responses. CareSight AI extracts structured data, validates completeness, and prepares an intake summary. During nurse check-in, the system assists by auto-filling forms, flagging inconsistencies, and supporting preliminary screenings such as cognitive memory tests by recording sessions, transcribing responses, and automatically scoring recall accuracy. Doctors receive a concise, explainable summary highlighting key findings and uncertainties.

CareSight AI functions as an autonomous agent following a perception-reasoning-action loop, supporting clinicians without replacing them and enabling scalable, remote-ready healthcare workflows.
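The perception-reasoning-action loop can be illustrated with a small sketch: perceive raw uploads into a structured record, reason about completeness, then act by emitting an intake summary or a request for missing inputs. The field names and stub extraction logic below are illustrative assumptions; the real system would use OCR, speech recognition, and Gemma 3N in the perception stage.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class IntakeRecord:
    """Structured output of the perception stage (fields are illustrative)."""
    patient_name: str = ""
    insurance_id: str = ""
    documents: List[str] = field(default_factory=list)

def perceive(uploads: Dict) -> IntakeRecord:
    # Stand-in for OCR/vision/speech extraction over patient uploads.
    return IntakeRecord(
        patient_name=uploads.get("id_card", ""),
        insurance_id=uploads.get("insurance_card", ""),
        documents=uploads.get("reports", []),
    )

def reason(record: IntakeRecord) -> List[str]:
    # Validate completeness and flag gaps for the nurse to resolve.
    missing = []
    if not record.patient_name:
        missing.append("patient ID")
    if not record.insurance_id:
        missing.append("insurance card")
    return missing

def act(record: IntakeRecord, missing: List[str]) -> str:
    # Either prepare the intake summary or request the missing inputs.
    if missing:
        return "Incomplete intake; please provide: " + ", ".join(missing)
    return (f"Intake ready for {record.patient_name} "
            f"({len(record.documents)} report(s) attached)")
```

Keeping the three stages as separate functions mirrors the "supporting, not replacing" design: each stage's output (the structured record, the list of flagged gaps) stays inspectable by the clinician before any action is taken.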
15 Feb 2026