TII UAE AI technology page Top Builders

Explore the top contributors showcasing the highest number of TII UAE AI technology page app submissions within our community.


The Technology Innovation Institute (TII) is a leading global research center based in Abu Dhabi, United Arab Emirates. TII focuses on pushing the frontiers of knowledge and delivering transformative technologies through its teams of scientists, researchers, and engineers. As part of the Abu Dhabi Government's Advanced Technology Research Council, TII serves as a catalyst for change and sets new standards in scientific research.

Organization: Technology Innovation Institute (TII)
Location: Abu Dhabi, United Arab Emirates
Area served: Worldwide


TII operates under the Abu Dhabi Government's Advanced Technology Research Council, which oversees technology research in the emirate.

Faced with a future of limitless possibilities and supported by strategically funded investments, TII encourages a culture of discovery. Their work reinforces Abu Dhabi and the UAE's status as an R&D hub and a global leader in breakthrough technologies.

Key Projects and Initiatives

TII UAE is involved in various projects and initiatives aimed at driving innovation and creating a better future. Some of their key projects include:

  • Falcon LLM: TII's flagship series of large language models, built from scratch using a custom data pipeline and distributed training library. Falcon LLM models are state-of-the-art for their size, outperforming most other models on NLP benchmarks. TII has open-sourced several Falcon LLM artifacts, including pretrained models, instruction/chat models, and the RefinedWeb dataset.

  • RefinedWeb Dataset: A massive web dataset with stringent filtering and large-scale deduplication, enabling models trained on web data alone to match or outperform models trained on curated corpora. RefinedWeb is licensed under Apache 2.0.

  • Open Call for Proposals: TII is calling for proposals from the global research community and SME entrepreneurs to submit use cases for Falcon LLM, promoting collaborations and driving innovation.

  • TII Falcon LLM License: A fork of Apache 2.0, this license allows researchers and developers to freely use TII's models for research and personal purposes. For commercial use, royalties are exempted for attributable revenues under $1 million per year; otherwise, a commercial agreement with TII is required.
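The open-sourced artifacts listed above are published on the Hugging Face Hub under the `tiiuae` organization (e.g. `tiiuae/falcon-7b-instruct` and `tiiuae/falcon-refinedweb`). A minimal sketch of loading them with the `transformers` and `datasets` libraries follows; `falcon_repo_id` is a hypothetical helper, and the download-heavy part is guarded so the module stays cheap to import:

```python
def falcon_repo_id(size_b: int, variant: str = "base") -> str:
    """Build the Hub repo id for a Falcon checkpoint.

    size_b:  parameter count in billions (e.g. 7 or 40)
    variant: "base" for the pretrained model, "instruct" for the instruct/chat model
    """
    suffix = "-instruct" if variant == "instruct" else ""
    return f"tiiuae/falcon-{size_b}b{suffix}"


if __name__ == "__main__":
    # Heavyweight downloads live behind the guard.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from datasets import load_dataset

    repo = falcon_repo_id(7, "instruct")  # "tiiuae/falcon-7b-instruct"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")

    # RefinedWeb is very large; stream it rather than downloading in full.
    refinedweb = load_dataset("tiiuae/falcon-refinedweb", streaming=True, split="train")
    first_record = next(iter(refinedweb))
```

Streaming mode keeps the RefinedWeb access lazy, which matters for a web-scale corpus that would otherwise not fit on a typical workstation.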

TII UAE AI technology page Hackathon projects

Discover innovative solutions crafted with TII UAE AI technology page, developed by our community members during our engaging hackathons.



LoopX

Problem: Businesses in the Arabian market face a unique set of challenges: they are swamped with tedious, repetitive tasks such as customer service, data entry, and record management. These jobs consume significant time and resources, dragging down efficiency and keeping staff from more strategic work. Moreover, most existing tech solutions do not fully grasp the Arabic language or culture, making them less effective and sometimes out of touch with local needs.

Solution: LoopX offers startups, SMEs, and even governments an easy, fast, and affordable way to run specialized AI agents through its platform, backed by a support team of experts. Any Arabian business can build and run its own AI agents within hours, giving the region's 23+ million startups and SMEs access to AI agents without months of development or thousands of dollars spent building them from scratch. These AI agents, crafted with a deep understanding of local culture and language, automate operations such as customer support and data processing with high precision. The result streamlines workflows while respecting regional nuances and cultural identity, transforming how companies operate in the digital era.

Products and services:
1. AI Agents Marketplace: a platform offering specialized AI agents for business process automation.
2. Custom AI Agent Building: tailored AI automation services for unique business needs.
3. AI Consultation Service: expert guidance on AI adoption and strategic implementation.

Traction: LoopX has gained 8 customers and 2 major projects, driving close to 3K USD in revenue.

Falcon Document Parser

In the rapidly evolving landscape of document processing, businesses continually seek solutions that enhance efficiency, reduce manual workload, and ensure accurate data extraction from crucial documents such as invoices. This document parsing application, powered by Falcon LLM, stands out in that domain, delivering high precision when interpreting and extracting information from varied invoice formats.

Falcon LLM, a cutting-edge large language model, is renowned for its capacity to grasp the complexities of human language. The application harnesses that capability and goes a step further by employing advanced fine-tuning techniques: Parameter-Efficient Fine-Tuning (PEFT) and Quantized Low-Rank Adaptation (QLoRA). These techniques let the model adapt to the specific nuances and variations of different invoice formats, ensuring a high level of accuracy across diverse datasets.

Hosting the application on Streamlit adds a layer of user-friendliness and accessibility. Streamlit is known for deploying data applications rapidly with minimal setup, and here it provides an intuitive web interface: users upload invoices, initiate the parsing process, and receive the extracted data in real time. This simplifies the user experience and makes the capabilities of the fine-tuned Falcon LLM accessible to a broad audience, regardless of technical expertise.

This document parsing application represents a significant step forward in automating and optimizing the invoice processing workflow. By leveraging Falcon LLM, fine-tuning with PEFT and QLoRA, exposing an API endpoint for easy integration, and hosting the solution on Streamlit, it delivers an accessible, user-friendly solution.
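The text-level plumbing around such a parser can be sketched as below. This is a minimal illustration, not the project's actual code: `build_invoice_prompt`, `parse_model_reply`, and the field list are hypothetical names, and the fine-tuned Falcon model call and Streamlit UI are omitted.

```python
import json
import re

# Fields the (hypothetical) extraction prompt asks the model to return.
FIELDS = ["vendor", "invoice_number", "date", "total"]


def build_invoice_prompt(invoice_text: str) -> str:
    """Wrap raw invoice text in an extraction instruction for the LLM."""
    field_list = ", ".join(FIELDS)
    return (
        f"Extract the following fields from the invoice as JSON "
        f"({field_list}). Use null for missing fields.\n\n"
        f"Invoice:\n{invoice_text}\n\nJSON:"
    )


def parse_model_reply(reply: str) -> dict:
    """Pull the first JSON object out of a model reply, tolerating extra text."""
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model reply")
    data = json.loads(match.group(0))
    # Keep only expected fields so downstream code sees a stable schema.
    return {field: data.get(field) for field in FIELDS}
```

In a deployed app, a Streamlit `st.file_uploader` widget would supply the invoice text, the fine-tuned Falcon model would generate the reply, and `parse_model_reply` would turn it into a structured record.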

Falcon Barista

Who are we?
We are a new startup dedicated to revolutionizing the restaurant industry with cutting-edge AI solutions. This hackathon gives us an opportunity to showcase an early concept of our chatbot (still significantly in development) built on top of Falcon LLM.

The Problem
As the restaurant industry continues to recover from the coronavirus pandemic, it confronts numerous challenges, the foremost and most significant being high labor costs.

The Solution
We introduce Falcon Barista, an order-taking bot for coffee shops and restaurants, designed to converse with customers in the most human-like manner possible. Although still under development, the bot is envisioned as an affordable alternative to manual labor at counters, drive-throughs, and over the phone.

What makes Falcon Barista better?
The primary innovation behind Falcon Barista is its minimal compute requirement, which maximizes cost savings. While many chatbots are built on LLMs with over 100 billion parameters, Falcon Barista runs on a much more compact fine-tuned Falcon-7B LLM with only 7 billion parameters. This efficiency comes from pairing smaller fine-tuned BERT models with the Falcon LLM: Falcon-7B guides the conversation while BERT handles information extraction, such as identifying food items and their quantities. Falcon Barista uses a quantized version of Falcon-7B and can be deployed on a single GPU with 16 GB of RAM. It also offers Automatic Speech Recognition and Text-to-Speech capabilities, enabling human-like spoken conversations with customers.

Challenges (due to limited compute resources)
1. Significant latency (~10 s).
2. The BERT model, still being fine-tuned, can easily become confused.
3. Falcon-7B requires further fine-tuning for more efficient conversation management.
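The division of labor described above (Falcon-7B steers the dialogue while a small model extracts items and quantities) can be sketched with a toy rule-based extractor standing in for the fine-tuned BERT model. The menu, number words, and `extract_order` helper are all illustrative, not the project's actual implementation:

```python
import re

# Toy stand-in for the fine-tuned BERT extractor: given a customer utterance,
# pull out {item: quantity} pairs for known menu items.
MENU = {"latte", "cappuccino", "espresso", "croissant"}

NUMBER_WORDS = {"a": 1, "an": 1, "one": 1, "two": 2, "three": 3, "four": 4}


def extract_order(utterance: str) -> dict:
    """Return {item: quantity} for menu items mentioned in the utterance."""
    order: dict = {}
    tokens = re.findall(r"[a-z]+|\d+", utterance.lower())
    for i, token in enumerate(tokens):
        # Singularize a trailing "s" so "lattes" matches "latte".
        item = token[:-1] if token.endswith("s") and token[:-1] in MENU else token
        if item in MENU:
            qty = 1
            if i > 0:
                prev = tokens[i - 1]
                if prev.isdigit():
                    qty = int(prev)
                elif prev in NUMBER_WORDS:
                    qty = NUMBER_WORDS[prev]
            order[item] = order.get(item, 0) + qty
    return order
```

In the pipeline sketched here, ASR would transcribe the customer's speech, the quantized Falcon-7B would generate the conversational reply, and an extraction step like this would keep a structured running order for the point-of-sale system.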