
1. The Problem: Philippine Disaster Response Lag

The Philippines consistently ranks as the world's most disaster-prone nation. During events like Typhoon Carina, the gap between satellite data acquisition and actionable ground intelligence is measured in hours, and those hours cost lives. Existing systems lack the multimodal intelligence needed to translate complex geospatial data into localized, bilingual directives for responders.

2. The Solution: Project ARK

Project ARK is a defense-grade geospatial protocol optimized for the AMD MI300X. It serves as a sovereign intelligence layer that stays operational even when global cloud infrastructure fails. ARK transforms raw satellite telemetry into high-fidelity disaster reports, providing tactical damage assessment, rescue routing, and resource allocation in English and Tagalog.

3. How I Built It: Triple-Track Integration

Leveraging the massive memory bandwidth of the MI300X, ARK executes a unified pipeline across three hackathon tracks. Illustrative code sketches for each track appear at the end of this section.

🤖 Track 1: AI Agents & Agentic Workflows

I engineered an agentic orchestration layer with LangGraph and CrewAI. It automates the workflow from ingestion to reporting, coordinating six specialized agents: QA Node, Damage Assessment, Economic Valuation, Insurance Risk, Recovery Planning, and NDRRMC Officer.

⚡ Track 2: Fine-Tuning on AMD GPUs

For domain-specific accuracy, I performed LoRA fine-tuning on the NASA/IBM Prithvi-100M geospatial model and on Qwen-VL-7B. This yields flood segmentation and damage assessment specialized for Philippine terrain and infrastructure, with zero-shot generalization to unseen scenes.

🎨 Track 3: Vision & Multimodal AI

ARK uses the 192 GB of HBM3 memory on the MI300X for high-throughput multimodal processing. The system ingests multi-spectral imagery and outputs visual-language interpretations. By combining Prithvi (geospatial vision) with Qwen-VL (multimodal agents), ARK provides an "eye-in-the-sky" that understands both the visual wreckage and the logical steps for extraction and relief.
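
Sketch 1, Track 1 (agentic workflow). A minimal sketch of how the six-agent graph could be wired, assuming a LangGraph StateGraph with stub node functions; the state fields, node bodies, and the three of six agents shown are illustrative placeholders rather than the production implementation.

```python
# Minimal sketch of the Track 1 agent graph. Node functions are stubs; in the
# real pipeline each one would call a fine-tuned model or a CrewAI crew.
from typing import TypedDict
from langgraph.graph import StateGraph, END


class ARKState(TypedDict):
    imagery_meta: dict   # ingested satellite telemetry / tile metadata
    flood_mask: dict     # segmentation output from the vision stage
    damage_report: str   # accumulating assessment text
    final_report: str    # bilingual (English/Tagalog) NDRRMC report


def qa_node(state: ARKState) -> dict:
    # Validate imagery quality (cloud cover, georeferencing) before analysis.
    return {"imagery_meta": {**state["imagery_meta"], "qa_passed": True}}


def damage_assessment(state: ARKState) -> dict:
    # Placeholder: interpret the flood mask into structural damage estimates.
    return {"damage_report": "barangay-level damage summary"}


def ndrrmc_officer(state: ARKState) -> dict:
    # Placeholder: render the final bilingual directive for responders.
    return {"final_report": "English + Tagalog situational report"}


graph = StateGraph(ARKState)
graph.add_node("qa", qa_node)
graph.add_node("damage", damage_assessment)
graph.add_node("ndrrmc", ndrrmc_officer)
# Economic Valuation, Insurance Risk, and Recovery Planning nodes are elided
# here; they sit between "damage" and "ndrrmc" in the full six-agent graph.
graph.set_entry_point("qa")
graph.add_edge("qa", "damage")
graph.add_edge("damage", "ndrrmc")
graph.add_edge("ndrrmc", END)

app = graph.compile()
result = app.invoke(
    {"imagery_meta": {}, "flood_mask": {}, "damage_report": "", "final_report": ""}
)
```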
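
Sketch 2, Track 2 (LoRA fine-tuning). A minimal adapter-setup sketch, assuming Hugging Face PEFT and Transformers on a ROCm build of PyTorch (the MI300X is exposed through the usual torch.cuda path). The checkpoint name, target modules, and hyperparameters are assumptions, and the Prithvi-100M fine-tune follows its own segmentation pipeline not shown here.

```python
# LoRA setup sketch for the Qwen-VL side of Track 2. Only the low-rank
# adapters are trained; the base model stays frozen.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "Qwen/Qwen-VL-Chat"  # assumed checkpoint; swap in the exact model used

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # MI300X handles bf16 natively
    device_map="auto",           # fits comfortably in 192 GB of HBM3
    trust_remote_code=True,
)

lora_config = LoraConfig(
    r=16,                        # low-rank adapter dimension
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["c_attn"],   # attention projection name in Qwen-VL; varies by model
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# From here a standard transformers Trainer (or TRL SFTTrainer) runs the
# domain-specific flood/damage fine-tune on Philippine disaster imagery pairs.
```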
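
Sketch 3, Track 3 (multimodal interpretation). A sketch of the vision-language step, assuming the Qwen-VL-Chat checkpoint and its custom chat() interface loaded with trust_remote_code; the file paths and prompt are placeholders.

```python
# Vision-language sketch: the flood mask rendered by the Prithvi segmentation
# stage is passed alongside the raw scene so the VLM grounds its narrative in
# the geospatial output and answers bilingually.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen-VL-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
).eval()

query = tokenizer.from_list_format([
    {"image": "scenes/typhoon_carina_tile_042.png"},       # placeholder path
    {"image": "masks/typhoon_carina_tile_042_flood.png"},  # placeholder path
    {"text": (
        "The second image is a flood-extent mask for the first. "
        "Describe the visible damage, list blocked roads, and give the "
        "recommended rescue route. Answer in English, then in Tagalog."
    )},
])

response, _history = model.chat(tokenizer, query=query, history=None)
print(response)
```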
10 May 2026