Brazil
5+ years of experience
I am Rafael Robson, a Technical Founder whose trajectory began long before my formal education: I started programming at age 8 and began architecting AI solutions at 13. This decade of hands-on experience lets me bridge the gap between complex business strategy and deep, low-level technical execution.

On the technical side, I am currently architecting a proprietary SaaS ecosystem. It uses LangGraph and LangChain to orchestrate multiple AI "brains" that automate complex data-engineering pipelines under my supervision, allowing me to scale my output and refine high-level results with extreme efficiency.

Technical Core & Stack:
• Generative AI & LLMOps: agent orchestration (LangChain/LangGraph), fine-tuning, LLM quantization, and local deployment.
• Software Engineering: a polyglot approach spanning Go, Scala, Java, Python, SQL, and mission-critical systems in COBOL.
• Data & Analytics: analytics engineering with dbt, MLOps, and cloud infrastructure.

I am driven by the challenge of transforming dreams and complex code into market-shifting products. Whether the domain is sustainable mobility or generative AI, my focus is on building systems that generate real value and social impact.

RepoMind is a context-aware AI code-review system built on a multi-agent architecture. Unlike traditional automated reviewers that operate only on the diff, RepoMind first indexes the entire repository using AST parsing (tree-sitter), vector embeddings, and a dependency graph, then retrieves semantically relevant context before reviewing each pull request.

The review pipeline (Indexer, Contextualizer, Reviewer, Critic, and Reporter) is orchestrated with LangGraph and streams results live to the frontend. An interactive 3D knowledge graph (Three.js) visualizes the repository architecture and highlights exactly which files the RAG retrieved during each review, making the AI's reasoning transparent and auditable.

Built privacy-first: it runs fully locally via Ollama, so source code never leaves the developer's machine. Switching LLM providers (Ollama, AMD Developer Cloud, Groq) requires zero code changes, just an edit to the .env file.

Stack: FastAPI, React, LangGraph, ChromaDB, tree-sitter, NetworkX, Three.js.
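The env-driven provider switching described above can be sketched in a few lines. This is a minimal illustration, not RepoMind's actual code: the variable name LLM_PROVIDER, the provider table, and the resolve_provider helper are all assumptions made for the example.

```python
import os

# Hypothetical provider registry; endpoint URLs and keys are illustrative
# assumptions, not RepoMind's real configuration.
PROVIDERS = {
    "ollama": {"base_url": "http://localhost:11434/v1", "local": True},
    "groq":   {"base_url": "https://api.groq.com/openai/v1", "local": False},
}

def resolve_provider() -> dict:
    """Pick the LLM backend from the environment (e.g. values loaded
    from a .env file at startup), so switching providers requires only
    a config edit, never a code change."""
    name = os.environ.get("LLM_PROVIDER", "ollama").lower()
    if name not in PROVIDERS:
        raise ValueError(f"Unknown LLM provider: {name!r}")
    return {"name": name, **PROVIDERS[name]}

# Simulate editing .env to point at a hosted provider:
os.environ["LLM_PROVIDER"] = "groq"
cfg = resolve_provider()
print(cfg["name"], cfg["local"])  # groq False
```

The rest of the pipeline only ever calls resolve_provider(), so the graph nodes stay identical whether inference runs locally or in the cloud.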
10 May 2026