
Alice is a local-first AI sidecar designed for real-time system control through natural language. It runs entirely on-device, using a single orchestrator model to interpret intent, route actions, and execute commands with low latency.

The interaction model is deliberately minimal: a small, always-available widget that listens on demand and responds instantly. There is no traditional interface, no menus, and no reliance on cloud infrastructure. Alice integrates lightweight context awareness, including active-application state and optional screen understanding, allowing it to perform meaningful system-level actions such as controlling media, managing applications, and navigating workflows without interrupting the user.

Under the hood, Alice uses a modular, provider-based architecture that separates reasoning, context gathering, and execution. This separation keeps the system reliable, scalable, and cleanly extensible while maintaining performance on constrained hardware.

The system is designed to prioritize:

- low-latency interaction
- local-first privacy
- intentional, user-driven control
- real-world usability over theoretical capability

Built as a prototype for the AMD Developer Hackathon 2026, Alice demonstrates that effective AI systems do not require large-scale infrastructure. With the right architecture, even CPU-only environments can deliver responsive, practical intelligence.
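To make the provider-based separation concrete, here is a minimal sketch of how reasoning, context gathering, and execution might be wired together. All class and method names (`ContextProvider`, `Orchestrator`, `handle`, and so on) are illustrative assumptions, not Alice's actual API:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class Context:
    """Lightweight snapshot of system state passed to actions."""
    active_app: str
    screen_summary: Optional[str] = None  # optional screen understanding

class ContextProvider:
    """Gathers one slice of system state (e.g. the active window)."""
    def gather(self) -> dict:
        raise NotImplementedError

class ActiveAppProvider(ContextProvider):
    def gather(self) -> dict:
        # Stub for illustration; a real provider would query the OS.
        return {"active_app": "music_player"}

class Orchestrator:
    """Routes a parsed intent to its registered executor."""
    def __init__(self) -> None:
        self.providers: List[ContextProvider] = []
        self.actions: Dict[str, Callable[[Context], str]] = {}

    def register(self, intent: str, action: Callable[[Context], str]) -> None:
        self.actions[intent] = action

    def handle(self, intent: str) -> str:
        # Context gathering is separated from execution: merge all
        # provider snapshots, then dispatch to the matching action.
        state: dict = {}
        for provider in self.providers:
            state.update(provider.gather())
        ctx = Context(active_app=state.get("active_app", "unknown"))
        action = self.actions.get(intent)
        return action(ctx) if action else "no handler for intent"

orch = Orchestrator()
orch.providers.append(ActiveAppProvider())
orch.register("pause_media", lambda ctx: f"paused media in {ctx.active_app}")
print(orch.handle("pause_media"))  # → paused media in music_player
```

Keeping each concern behind its own small interface is what lets new context sources or actions be added without touching the orchestration loop.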
10 May 2026