
XIAO Field Copilot is a focused multimodal support assistant for Seeed Studio XIAO edge hardware. Rather than building a full wiki-scale RAG system, the project narrows the problem to a practical field workflow: upload a photo of a XIAO board, ask a hardware question, and receive an answer grounded in curated documentation. The copilot supports board identification, troubleshooting, pin and wiring lookup, board comparison, and setup guidance. It works from a XIAO-only corpus covering common boards such as the XIAO ESP32S3, ESP32C3, ESP32C6, RP2040, nRF52840, and SAMD21, and every answer includes source-backed citations so users can verify the guidance against official Seeed documentation.

The system is intentionally lightweight for a hackathon demo. A Gradio app on Hugging Face Spaces connects to hosted vLLM endpoints: a Qwen3-VL embedding model for multimodal image/text retrieval, an optional multimodal reranker for grounding, and a small Qwen-style agent model that writes concise support responses. A simple agent router decides whether the user is trying to identify a board, troubleshoot a symptom, compare devices, or find wiring or setup instructions.

On the AMD side, the model endpoints are designed to run on AMD Developer Cloud with ROCm, using AMD Instinct hardware to keep the multimodal retrieval and generation stack available as hosted services. The final product demonstrates how AMD GPUs can power practical edge-device support workflows without requiring the end user to run models locally.
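The agent router described above can be sketched with simple keyword heuristics. This is an illustrative assumption, not the project's actual routing logic: the intent names, keyword lists, and the photo-defaults-to-identification rule are all hypothetical.

```python
# Hypothetical sketch of the intent router; keyword lists and intent
# names are illustrative assumptions, not the project's real logic.
INTENT_KEYWORDS = {
    "identify": ["what board", "which board", "identify"],
    "troubleshoot": ["not working", "error", "fails", "crash"],
    "compare": ["versus", "compare", "difference between"],
    "wiring": ["pin", "pinout", "wire", "wiring", "gpio"],
    "setup": ["install", "setup", "flash", "getting started"],
}

def route_intent(question: str, has_image: bool = False) -> str:
    """Pick a support intent from keyword heuristics.

    A photo with no strong textual cue defaults to board
    identification; a bare text question defaults to setup help.
    """
    q = question.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        # Substring matching keeps the sketch short; a real router
        # would use the agent model or at least word-boundary matching.
        if any(k in q for k in keywords):
            return intent
    return "identify" if has_image else "setup"
```

A production router would likely delegate ambiguous cases to the agent model itself, but a cheap first-pass filter like this keeps most requests off the LLM path.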
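Because vLLM serves an OpenAI-compatible /v1/chat/completions route, the Gradio app can talk to the hosted agent endpoint with a plain HTTP request. The sketch below is a minimal assumption of how that call might look; the base URL, model id, system prompt, and sampling parameters are all placeholders, not the project's actual configuration.

```python
# Minimal sketch of calling the hosted vLLM agent endpoint.
# VLLM_BASE_URL and AGENT_MODEL are placeholder assumptions.
import json
import urllib.request

VLLM_BASE_URL = "https://example-endpoint/v1"  # placeholder
AGENT_MODEL = "Qwen/Qwen2.5-7B-Instruct"       # placeholder model id

def build_chat_payload(question: str, context_chunks: list[str]) -> dict:
    """Assemble an OpenAI-style chat payload with retrieved doc context."""
    context = "\n\n".join(context_chunks)
    return {
        "model": AGENT_MODEL,
        "messages": [
            {"role": "system",
             "content": "Answer XIAO hardware questions using only the "
                        "provided documentation excerpts, and cite them."},
            {"role": "user",
             "content": f"Documentation:\n{context}\n\nQuestion: {question}"},
        ],
        "temperature": 0.2,
        "max_tokens": 512,
    }

def ask_agent(question: str, context_chunks: list[str]) -> str:
    """POST the payload to the vLLM server and return the reply text."""
    req = urllib.request.Request(
        f"{VLLM_BASE_URL}/chat/completions",
        data=json.dumps(build_chat_payload(question, context_chunks)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Keeping the payload builder separate from the network call makes the prompt assembly easy to test without a live endpoint.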
10 May 2026