Leveraging Composio for Advanced Multi-Agent AI Applications
Introduction
Hello! 👋🏽 I'm Tommy, and today we're diving into the fascinating world of AI-driven automation using the Composio framework. Composio supercharges AI agents by allowing them to interact seamlessly with over 150 tools, making it a powerful solution for automating complex tasks with minimal effort.
In this guide, we'll explore how to harness the power of Composio and the AutoGen framework to create a multi-agent system that reads customer feedback from a CSV file, analyzes the data, and generates a detailed summary report in a Google Doc. Stick around to see it all come together with hands-on examples, practical insights, and a complete implementation in Google Colab at the end!
Prerequisites
Before diving into this tutorial, ensure you have the following:
- Python (3.8 or higher): Ensure Python is installed on your system. You can download it from the official Python website.
- Conda: We'll use a Conda environment to manage dependencies. If you don't have Conda installed, follow the installation guide here.
- VS Code: We'll be working in Visual Studio Code for managing our project files and code. You can download it here.
- Composio account: Sign up for a free account at Composio's official website and gain access to the 150+ external tools it integrates.
- Groq API Key: You can access the API key from here after signing up.
- Google account: We'll interact with Google Docs.
Setting Up the Environment
Let's get started with setting up our environment. Here's how to do it step by step:
Step 1: Create a Directory and Open in VS Code
First, open your terminal and create a new directory for your project. You can name the directory anything you like, but for this tutorial, we'll use composio-test. Then, navigate into the directory:
mkdir composio-test
cd composio-test
Once inside the directory, open it in Visual Studio Code by running the following command:
code .
This opens the current directory in VS Code.
Step 2: Create and Activate a Conda Environment
Next, you'll need to create a new Conda environment. This environment will house all the dependencies needed for this project. To do this, run the command below in your VS Code terminal:
conda create --name composio-test python=3.11.4
After the environment is created, activate it:
conda activate composio-test
Step 3: Create a Python File
In the VS Code directory, create a new file called app.py. This will serve as the main file where you'll write the logic for your multi-agent system.
Step 4: Select the Correct Interpreter in VS Code
Make sure your VS Code is using the interpreter from the composio-test Conda environment. To do this:
- Open the Command Palette in VS Code (Cmd/Ctrl + Shift + P).
- Search for "Python: Select Interpreter."
- Select the interpreter that corresponds to your new Conda environment (e.g., composio-test).
It should show a path like /opt/anaconda3/envs/composio-test/bin/python.
Step 5: Install Required Packages
Now that your environment is set up, let's install the necessary dependencies using pip. In this case, we'll install composio-autogen and python-dotenv:
/opt/anaconda3/envs/composio-test/bin/python3 -m pip install composio-autogen python-dotenv
Ensure you're using the correct path to your Python interpreter from your Conda environment.
Step 6: Log in to Composio
Once your packages are installed, you need to log in to Composio. Run the following command:
composio login
This will prompt you to follow the login steps, including entering a generated key.
Step 7: Add Google Docs Tool
With Composio set up, it's time to add the Google Docs tool, since we'll be using it to generate a report later. Run the command below:
composio add googledocs
Building the Multi-Agent System
Now, we'll build the multi-agent system using Composio and AutoGen. The system will be split into two parts to fully showcase Composio in action and ensure proper communication between the agents.
Step 1: Create the CSV File
First, create a CSV file named dummy_customer_feedback.csv that contains customer feedback data, ratings, names, and more. Here's an example, but feel free to modify or add more rows to suit your use case:
user_id,username,feedback,rating,date
1,user123,The app is really helpful for learning on the go!,4.5,2024-09-01
2,learner456,"Great app, but sometimes it crashes when I try to access my profile.",3.0,2024-09-01
3,student789,I love the interactive quizzes! They help me understand concepts better.,5.0,2024-09-02
4,teacher101,The app lacks advanced features for tracking student progress.,2.5,2024-09-02
5,educator202,Very user-friendly and intuitive design.,4.0,2024-09-03
This file will serve as the data source for analysis by our agents.
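If you'd rather generate the file from code than type it by hand, here is a minimal sketch using Python's built-in csv module; the rows simply mirror the sample above:
import csv

# Sample rows matching the table above
rows = [
    (1, "user123", "The app is really helpful for learning on the go!", 4.5, "2024-09-01"),
    (2, "learner456", "Great app, but sometimes it crashes when I try to access my profile.", 3.0, "2024-09-01"),
    (3, "student789", "I love the interactive quizzes! They help me understand concepts better.", 5.0, "2024-09-02"),
    (4, "teacher101", "The app lacks advanced features for tracking student progress.", 2.5, "2024-09-02"),
    (5, "educator202", "Very user-friendly and intuitive design.", 4.0, "2024-09-03"),
]

with open("dummy_customer_feedback.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["user_id", "username", "feedback", "rating", "date"])  # header row
    writer.writerows(rows)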
Step 2: Set Up Environment Variables
Create a .env file to store your API keys. You'll need a Groq API key for the LLaMA3-70B model and a Composio API key, which can be found in your Composio account settings. The .env file should look like this:
GROQ_API_KEY=your-groq-api-key
COMPOSIO_API_KEY=your-composio-api-key
Step 3: Import Required Packages
Now, let's import the necessary packages in your app.py:
from composio_autogen import ComposioToolSet, App
from autogen import AssistantAgent, UserProxyAgent, GroupChat, GroupChatManager, ConversableAgent
import dotenv
from textwrap import dedent
import os
Step 4: Load Environment Variables
Use the dotenv library to load your environment variables:
dotenv.load_dotenv()
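As an optional sanity check (not part of the original flow, but cheap insurance), you can fail fast if either key did not load:
# Optional: stop early if a key is missing from .env
for key in ("GROQ_API_KEY", "COMPOSIO_API_KEY"):
    if not os.environ.get(key):
        raise RuntimeError(f"{key} is not set; check your .env file")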
Step 5: Define the LLM Configuration
Since we're using Groq's LLaMA3-70B model, define the configuration like this:
config_list = [
    {
        "model": "llama3-70b-8192",
        "api_key": os.environ.get("GROQ_API_KEY"),
        "api_type": "groq",
    }
]

llm_config = {"config_list": config_list, "timeout": 300, "temperature": 0}
Part One:
We create agents specifically for the task of reading and analyzing a CSV file containing customer feedback, then providing a detailed summary report of the key metrics obtained during the analysis.
Step 1: Create the Data Analysis Agent
Next, define the Data Insights Specialist agent, which will provide a summary report of key metrics and trends after analyzing the CSV data.
data_analysis_agent = ConversableAgent(
    "Data Insights Specialist",
    llm_config=llm_config,
    description=dedent("""\
        Provides Summary Report of key metrics and trends gotten
        from data provided by the File Parser agent.
    """),
    system_message=dedent("""\
        You are a data insights specialist who analyzes data from the File Parser agent and provides a detailed summary of key metrics and trends.
        Return the summary report as context to the user agent.
    """),
    max_consecutive_auto_reply=2,
)
Step 2: Create the File Parser Agent
The File Parser agent will handle reading, extracting, and formatting the CSV file's content:
da_proxy_agent = ConversableAgent(
    "File Parser and Extractor",
    llm_config=llm_config,
    description=dedent("""\
        Reads, Parses CSV files and Returns a formatted output using
        the column names in the first row of the CSV file.
    """),
    system_message=dedent("""\
        You are a file parser that reads CSV files and returns a formatted output based on the columns in the first row.
        Stop the process with the message "TERMINATE".
    """),
    human_input_mode="NEVER",
    code_execution_config={"use_docker": False},
    max_consecutive_auto_reply=5,
)
Step 3: Create the User Agent
The User Agent will collect and return the final summary from the Data Insights Specialist:
user_agent = UserProxyAgent(
    "User",
    description=dedent("""\
        Collects and returns the final response from the Data Insights Specialist.
    """),
    human_input_mode="NEVER",
    code_execution_config={"use_docker": False},
    max_consecutive_auto_reply=5,
)
Step 4: Create Group Chat and Manager
Now, we'll set up a GroupChat and GroupChatManager to facilitate communication between the agents:
group_chat = GroupChat(
    agents=[data_analysis_agent, da_proxy_agent, user_agent],
    messages=[],
    send_introductions=True,
    max_round=5,
)

group_chat_manager = GroupChatManager(
    groupchat=group_chat,
    llm_config=llm_config,
)
Step 5: Initialize the Composio Toolset
We now initialize the Composio Toolset, which provides access to all the tools available for our agents:
composio_toolset = ComposioToolSet(api_key=os.getenv("COMPOSIO_API_KEY"))
Step 6: Register Tools for the Agents
We register the tools that will be used by the File Parser and Data Insights Specialist agents:
composio_toolset.register_tools(
    apps=[App.FILETOOL],
    caller=data_analysis_agent,
    executor=da_proxy_agent,
)
By setting the apps parameter, our agent has access to all the actions in the FILETOOL app. We can also narrow that access by setting the actions parameter instead and passing only the list of actions we want the agent to use, as shown in the sketch below.
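For example, here is a sketch of registering only a handful of actions; it assumes the Action enum is importable from composio_autogen, and the specific action names below are illustrative, so confirm the exact names in your Composio dashboard or docs:
from composio_autogen import Action  # assumed import for individual tool actions

# Illustrative action names; verify the exact FILETOOL action identifiers before using them
composio_toolset.register_tools(
    actions=[Action.FILETOOL_OPEN_FILE, Action.FILETOOL_LIST_FILES],
    caller=data_analysis_agent,
    executor=da_proxy_agent,
)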
Step 7: Define the Task and Execute the Chat
Define the task that needs to be performed, including the CSV filename:
filename = "dummy_customer_feedback.csv"
task = f"Generate a detailed summary report on the CSV file {filename} in my current directory."
response = user_agent.initiate_chat(
    recipient=group_chat_manager,
    message=task,
    summary_method="last_msg",
)
Finally, retrieve the summary report provided by the Data Insights Specialist:
summary_report = response.summary
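Before moving on to Part Two, it is worth printing the report so you can confirm the agents actually produced one:
# Quick check that Part One returned a non-empty report
print(summary_report)
if not summary_report:
    raise ValueError("No summary report was returned; inspect the agent logs above")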
Part Two:
In Part Two, we'll extend the system built in Part One to write the summary report to a Google Document using Composio and AutoGen. We'll create new agents specifically for this task and utilize Google Docs tools available through Composio.
Step 1: Create Agents for Google Docs Writing
We start by defining the agents responsible for writing the summary report to a Google Document.
- Document Writer Agent: This agent initiates the write operation to the Google Document.
doc_writer_agent = AssistantAgent(
    "Document Writer",
    description=dedent("""
        Starts the write operation to a Google Document using the available tools provided.
    """),
    llm_config=llm_config,
    human_input_mode="NEVER",
    system_message=dedent("""
        You are a document writer agent that writes a summary report to a Google Document using the available tools provided.
    """),
    max_consecutive_auto_reply=2,
)
- Content Writer Agent: This agent is responsible for writing the content to the Google Document.
content_writer = UserProxyAgent(
    "Content Writer",
    description="Writes content to a Google Document",
    human_input_mode="NEVER",
    code_execution_config={"use_docker": False},
    max_consecutive_auto_reply=5,
)
- User Agent: This agent ensures the summary report is correctly written to the Google Document.
user_agent = UserProxyAgent(
    "User",
    description=dedent("""\
        Ensures the summary report is written to a Google Document.
    """),
    human_input_mode="NEVER",
    code_execution_config={"use_docker": False},
    max_consecutive_auto_reply=5,
)
Step 2: Set Up Group Chat and Manager
We then set up a new GroupChat and GroupChatManager to coordinate these agents:
group_chat = GroupChat(
    agents=[doc_writer_agent, content_writer, user_agent],
    messages=[],
    send_introductions=True,
    max_round=5,
)

group_chat_manager = GroupChatManager(
    groupchat=group_chat,
    llm_config=llm_config,
)
Step 3: Register the Google Docs Tool
Next, we register the Google Docs tool in Composio for our agents:
composio_toolset.register_tools(
    apps=[App.GOOGLEDOCS],
    caller=doc_writer_agent,
    executor=content_writer,
)
Step 4: Write the Summary to Google Docs
Now, we initiate the chat where the summary report generated in Part One is written to a Google Document titled "User Feedback Report":
res2 = user_agent.initiate_chat(
    recipient=group_chat_manager,
    message=f"Write the summary report '{summary_report}' to a Google Document named 'User Feedback Report'.",
)
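If you want to double-check the run programmatically, you can inspect the returned chat result; .summary was already used in Part One, while .chat_history is assumed here from AutoGen's ChatResult and may vary between versions:
# Optional: dump the final summary and a truncated view of each message in the run
print(res2.summary)
for msg in getattr(res2, "chat_history", []):
    print(msg.get("name", "unknown"), "->", str(msg.get("content", ""))[:200])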
Running the Python File
Now that our multi-agent system is fully set up, it's time to see it in action! We'll execute our Python script from the VS Code terminal to watch the magic unfold.
In your VS Code terminal, ensure you're in the root directory where your app.py file is located, then execute the script by running:
/opt/anaconda3/envs/composio-test/bin/python3 app.py
Ensure you're using the Python interpreter path from your Conda environment.
Review Execution Flow
Once you run your Python file, the logs for each agent and tool should be visible, as verbosity is enabled by default. Now, let's see the power of Composio tooling in action.
You should see all the tools available in FILETOOL, with the Data Insights Specialist Agent suggesting tool actions to perform based on the task.
The agent then delegates the execution to the File Parser and Extractor agent, which we defined earlier as the executor for the FILETOOL.
After the File Parser and Extractor agent executes all the tool calls suggested by the Data Insights Specialist, it returns the fields of the CSV file in a formatted way.
It then hands the data back to the Data Insights Specialist, which generates the detailed summary report.
Next, the User Agent kicks off Part Two by passing the summary report from the Data Insights Specialist to the new Group Chat Manager, which then calls the Document Writer Agent.
The Document Writer agent, armed with all actions for interacting with Google Docs, now suggests tool calls for the Content Writer Agent to perform.
The Content Writer Agent then starts execution and writes the summary to the Google document as defined by the Document Writer Agent.
After execution, check your Google Drive recents, and you should see the file named 'User Feedback Report' with the summary report.
Next Steps: Expanding Your Use of Composio
Now that you've successfully built and run a multi-agent system with Composio, there are several exciting avenues to explore next:
- Utilizing Triggers for Automated Actions: Triggers are a powerful feature in Composio that can automate workflows based on specific events or conditions. You can set up triggers to initiate actions automatically when certain criteria are met, like sending notifications, updating databases, or executing specific tasks in response to changes in your data. This can make your multi-agent system even more responsive and autonomous, reducing the need for manual intervention.
- Integrating Additional Tools and APIs: Composio supports a wide range of tools and APIs that you can integrate into your system. For instance, you could add tools for social media analytics, database management, or cloud storage, enabling your agents to perform even more diverse tasks. Exploring the Composio marketplace will give you ideas on expanding your system's capabilities.
- Exploring Advanced Agent Interactions: While this tutorial focused on a straightforward multi-agent setup, Composio allows for much more complex agent interactions. You could experiment with hierarchical agent structures, where certain agents manage or oversee others, or explore cooperative learning, where agents share knowledge to improve performance over time.
- Expanding to Other AI Models: Although we used Groq's LLaMA3-70B model in this tutorial, Composio and AutoGen support various AI models. You could experiment with different models or even integrate custom-trained models to tailor your agents' performance to specific needs; a configuration sketch follows this list.
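As a starting point for that last idea, here is a hedged sketch of an alternative LLM configuration; the model name and environment variable below are placeholders, so verify them against the AutoGen docs and your provider account:
# Illustrative alternative config_list; swap in any model/provider supported by AutoGen
openai_config_list = [
    {
        "model": "gpt-4o",  # placeholder model name
        "api_key": os.environ.get("OPENAI_API_KEY"),  # assumes this key is set in your .env
    }
]

llm_config = {"config_list": openai_config_list, "timeout": 300, "temperature": 0}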
Conclusion
In this tutorial, we explored building a multi-agent system using Composio and AutoGen, focusing on analyzing customer feedback from a CSV file and generating a summary report. We walked through setting up the environment, configuring the agents, and using tools to read and analyze data, followed by writing the summary to a Google document. This process demonstrated how Composio can be leveraged for complex workflows involving multiple agents while ensuring proper context and coordination.
During the project, I encountered challenges with the OpenAI API and retrieving summary reports in a single step. To overcome these issues, I switched to using the Groq API and split the process into two parts, which successfully achieved the desired outcome. These experiences highlighted both the flexibility of Composio and the importance of troubleshooting and adapting when working with new tools and frameworks.
Composio's vast array of tools and easy integration make it a powerful choice for developers looking to enhance their projects with AI capabilities. I encourage you to explore Composio further by checking out the official documentation. There's a lot more to discover, and I'm excited to continue experimenting with this versatile framework.
Happy coding!