
This project explores agentic AI in the role of an autonomous Search and Rescue (SAR) agent. On the AMD MI300X platform, we deploy Qwen2.5-VL-72B-Instruct via vLLM as the vision-language engine that controls an Unmanned Aerial Vehicle (UAV) to perform reconnaissance, search, and securing of a rescue target given a textual description of that target. A Next.js dashboard serves as the human-interaction interface, allowing operators to monitor the aircraft and the SAR agent at all times, with the option to override and take manual control at any moment. For demonstration purposes, the aircraft and its physics are simulated in Gazebo Sim, with a PX4-based flight controller and ROS nodes enabling inter-module communication. This project is a proof of concept of a fully autonomous, intelligent AI system delivering real-world impact in life-critical missions, exploring the possibility of "AI saving lives".
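To make the architecture concrete, the sketch below shows how the SAR agent might assemble a request for the vLLM-served Qwen2.5-VL model using the OpenAI-compatible chat format that vLLM exposes. The function name, system prompt, and instruction wording are illustrative assumptions, not the project's actual code; it only builds the payload, without contacting a server.

```python
# Minimal sketch (assumption, not the project's actual code): building an
# OpenAI-style chat-completion payload for a vLLM server hosting
# Qwen2.5-VL-72B-Instruct. One camera frame is sent as a base64 data URL
# alongside the textual description of the rescue target.
import base64
import json


def build_sar_request(frame_jpeg: bytes, target_description: str) -> dict:
    """Assemble a multimodal chat payload for the vision-language engine."""
    image_b64 = base64.b64encode(frame_jpeg).decode("ascii")
    return {
        "model": "Qwen/Qwen2.5-VL-72B-Instruct",
        "messages": [
            {
                "role": "system",
                "content": "You are an autonomous search-and-rescue UAV agent.",
            },
            {
                "role": "user",
                "content": [
                    # Camera frame, encoded as an inline data URL.
                    {
                        "type": "image_url",
                        "image_url": {
                            "url": f"data:image/jpeg;base64,{image_b64}"
                        },
                    },
                    # Hypothetical instruction format for the agent loop.
                    {
                        "type": "text",
                        "text": (
                            f"Search target: {target_description}. "
                            "Report whether the target is visible and "
                            "suggest the next manoeuvre."
                        ),
                    },
                ],
            },
        ],
        "max_tokens": 256,
    }


# Example usage with a stand-in JPEG header and target description.
payload = build_sar_request(b"\xff\xd8\xff\xe0", "hiker in a red jacket")
print(json.dumps(payload, indent=2)[:120])
```

In a running system, this payload would be POSTed to the server's `/v1/chat/completions` route (or sent via an OpenAI-compatible client), and the model's reply parsed into a flight decision published over ROS.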
10 May 2026