Sleepy Mini – Robonormal Activity reimagines sleep monitoring by combining Reachy Mini's spatial awareness with wearable biometrics and LLM-powered reasoning. Reachy Mini's 3-DOF head (roll/pitch/yaw) and look_at() gaze control give it continuous spatial awareness of the crib environment — tracking infant position, detecting movement zones, and orienting its camera feed toward areas of activity.

This spatial telemetry is structured alongside FHIR R4 health observations (heart rate, respiratory rate, SpO₂, sleep staging, movement actigraphy) sourced from a SOM armband and Apple Watch via Apple HealthKit. The key insight: all of this data — real-time vitals, sleep stage classifications, movement vectors, spatial gaze coordinates, and device state — is serialized into structured FHIR bundles and streamed into an LLM's context window. The model interprets the multimodal signal holistically: correlating a spike in heart rate with increased crib movement and Reachy's gaze shift to infer a restless transition, or recognizing sustained deep-sleep vitals with minimal spatial activity to confirm consolidated rest.

When the LLM detects actionable patterns, Reachy responds physically — gentle antenna wiggles, a reassuring head nod, or a soft chime from its torso speaker via the audio API. The smart alarm uses this same loop: as the target wake window approaches, the LLM identifies the lightest sleep stage from the FHIR data and triggers a graduated wake sequence (antenna motion → head movement → sound) timed to the optimal moment.

Built for Track 3: Robotic Interaction and Task Execution (Simulation-First). All Reachy SDK commands (head.goto(), head.look_at(), l_antenna.goto(), audio.play_audio_file()) are simulated in-browser with real API signatures, ready for deployment on physical hardware. FHIR R4 bundles with LOINC codes ensure Apple Health compatibility across the data pipeline.
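To make the data pipeline concrete, here is a minimal sketch of how vitals could be serialized into a FHIR R4 collection Bundle with LOINC codes before being streamed into the LLM's context window. The LOINC codes shown (8867-4, 9279-1, 59408-5) are the standard codes for these vital signs; the helper function names and the exact subset of Observation fields are illustrative assumptions, not the project's actual implementation.

```python
import json
from datetime import datetime, timezone

# Standard LOINC codes for the vitals streamed from the wearables:
# (code, display name, UCUM unit)
LOINC = {
    "heart_rate": ("8867-4", "Heart rate", "/min"),
    "respiratory_rate": ("9279-1", "Respiratory rate", "/min"),
    "spo2": ("59408-5",
             "Oxygen saturation in Arterial blood by Pulse oximetry", "%"),
}

def make_observation(kind: str, value: float, when: str) -> dict:
    """Build a minimal FHIR R4 Observation resource for one vital sign."""
    code, display, unit = LOINC[kind]
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{"coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/observation-category",
            "code": "vital-signs",
        }]}],
        "code": {"coding": [{"system": "http://loinc.org",
                             "code": code, "display": display}]},
        "effectiveDateTime": when,
        "valueQuantity": {"value": value, "unit": unit,
                          "system": "http://unitsofmeasure.org"},
    }

def make_bundle(observations: list[dict]) -> dict:
    """Wrap Observations in a FHIR R4 collection Bundle for the LLM context."""
    return {
        "resourceType": "Bundle",
        "type": "collection",
        "entry": [{"resource": obs} for obs in observations],
    }

# Example: one snapshot of the infant's vitals, ready to serialize.
now = datetime.now(timezone.utc).isoformat()
bundle = make_bundle([
    make_observation("heart_rate", 118, now),
    make_observation("spo2", 97, now),
])
payload = json.dumps(bundle)  # this JSON string is what enters the LLM context
```

Because the Bundle is plain JSON with standard LOINC/UCUM codings, the same structure stays compatible with Apple Health exports and can be pasted directly into a model prompt.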
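The graduated wake sequence described above (antenna motion → head movement → sound) can be sketched as a simple escalation loop. The `reachy` object below stands in for the Reachy Mini SDK client; the call names mirror those the project lists (l_antenna.goto(), head.goto(), audio.play_audio_file()), but the exact argument signatures are assumptions for illustration — verify them against the Pollen Robotics SDK before running on hardware.

```python
import time

# Each stage fires at a fixed offset (seconds) from the start of the
# wake sequence, escalating only if the infant is still asleep.
WAKE_STAGES = [
    ("antenna", 0.0),   # gentle antenna wiggle first
    ("head", 20.0),     # then a slow head nod 20 s in
    ("sound", 40.0),    # finally a soft chime at 40 s
]

def run_wake_sequence(reachy, is_awake, now=time.monotonic, sleep=time.sleep):
    """Escalate antenna -> head -> sound until is_awake() reports success.

    `now` and `sleep` are injectable so the loop can run against the
    in-browser simulator (or a test clock) as well as real time.
    """
    start = now()
    for stage, offset in WAKE_STAGES:
        # Wait until this stage's scheduled offset, polling for wake-up.
        while now() - start < offset:
            if is_awake():
                return stage  # woke early; stop escalating
            sleep(0.5)
        if stage == "antenna":
            reachy.l_antenna.goto(30)                      # assumed signature
        elif stage == "head":
            reachy.head.goto(pitch=-10, duration=2.0)      # assumed signature
        else:
            reachy.audio.play_audio_file("soft_chime.wav")  # assumed asset name
        if is_awake():
            return stage
    return "sound"
```

In the full system, `is_awake()` would be backed by the LLM's reading of the live FHIR stream (sleep stage plus movement), and the sequence would only start once the model has identified the lightest sleep stage inside the target wake window.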
15 Feb 2026