Llama 2 Tutorial: How to build an app with Llama 2 with Clarifai integration

Wednesday, August 23, 2023 by abdibrokhim

Introduction

Clarifai is a platform for discovering, building, and sharing AI models, workflows, and app components. It is a great low-code/no-code platform for developers who want to build AI-powered apps.

Llama-2 is a series of pre-trained and fine-tuned Large Language Models (LLMs) created by the Meta AI research team. It builds upon the success of its predecessor, Llama-1, and incorporates improvements to enhance performance and safety. Llama-2 models are designed for complex reasoning tasks across various domains and excel in dialogue scenarios, such as chatbot and conversational AI applications.

Specifically, Llama 2-Chat models are optimized for dialogue, generating human-like responses in natural language. These models, like the 70B version, are pre-trained on a substantial dataset that includes chat logs and social media posts, allowing them to understand and produce contextually relevant responses.

Llama 2-Chat models also undergo fine-tuning to ensure safety and helpfulness in their generated responses. This includes measures to prevent offensive or harmful content and to provide accurate information. The models possess a longer context window compared to Llama-1, allowing them to process more information and support longer conversations or document understanding.

Llama-2-Chat finds application across diverse domains, such as offering travel advice, providing mental health support, assisting with educational concepts, and acting as personal assistants. However, Llama-2's proficiency in non-English languages remains limited, and there is a potential risk of generating biased or harmful content due to the nature of the training data.

Llama-2 has undergone evaluation for pretraining, fine-tuning, and safety, demonstrating its strong performance on various NLP benchmarks and its relative safety for production use. It has outperformed other models in terms of helpfulness in human evaluations.

Let's get started

Create an account on Clarifai

Go to Clarifai and create an account or login if you already have one.

Create an account Clarifai

Create a new app

Once you have logged in, you will be redirected to the dashboard. Click on the Create an App button.

Create an App

Give your app a name and a short description, then hit the Create App button.

Create an App

If everything goes well, you will be redirected to the app page. Optionally, you can add a cover image for your app.

App page

Create a new workflow

From the app page, select Workflows from the left sidebar and click on the Create Workflow button.

Create Workflow

This opens a no-code space where you can build your workflow. The left sidebar lists the available components, the canvas in the middle is where you drag and drop components to assemble the workflow, and the right sidebar shows the properties of the selected component.

First, change the default workflow name to something readable, such as Llama2TutorialWorkflow. Then, in the left sidebar, search for the Text-to-Text component, drag it onto the canvas, and connect it to IN. In the right sidebar, select the llama2-70b-chat model from the dropdown list. Finally, click on the Save Workflow button.

Create Workflow

Perfect! You have created your first workflow. Now, let's test it.

Test your workflow

Click on the + button, enter your desired text, for example: I have a headache. What should I do?, and click on the Submit button. Wait a few seconds, and the model's response will appear on the right side of the screen. You can also inspect the raw JSON response from the model by clicking on the View JSON button.

Create Workflow

Let's dive deeper

Create a new module

Before that, let's create a new Streamlit app. Open Visual Studio Code and create a new file: app.py. Here we will build a simple UI for our app. Copy and paste the following code:

import streamlit as st
from streamlit_chat import message
import llama

def clear_chat():
    st.session_state.messages = [{"role": "assistant", "content": "Say something to get started!"}]


st.title("Llama2 Clarifai Tutorial")


if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant", "content": "Say something to get started!"}]


with st.form("chat_input", clear_on_submit=True):
    a, b = st.columns([4, 1])

    user_prompt = a.text_input(
        label="Your message:",
        placeholder="Type something...",
        label_visibility="collapsed",
    )

    # Capture the submit state so we only call the model when Send is clicked
    submitted = b.form_submit_button("Send", use_container_width=True)


for msg in st.session_state.messages:
    message(msg["content"], is_user=msg["role"] == "user")


if submitted and user_prompt:

    print('user_prompt: ', user_prompt)

    st.session_state.messages.append({"role": "user", "content": user_prompt})
    
    message(user_prompt, is_user=True)

    response = llama.get_response(user_prompt)  # get response from llama2 API (in our case from Workflow we created before)

    msg = {"role": "assistant", "content": response}

    print('st.session_state.messages: ', st.session_state.messages)

    st.session_state.messages.append(msg)

    print('msg.content: ', msg["content"])

    message(msg["content"])


if len(st.session_state.messages) > 1:
    st.button('Clear Chat', on_click=clear_chat)
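As a side note, the chat history that app.py keeps in st.session_state.messages is just a plain Python list of role/content dictionaries. Here is a minimal sketch of that pattern outside Streamlit, mirroring the clear_chat helper above:

```python
# Chat history as a plain list of {"role", "content"} dicts,
# mirroring what app.py stores in st.session_state.messages.
GREETING = {"role": "assistant", "content": "Say something to get started!"}

def clear_chat(messages):
    # Reset history in place to the initial assistant greeting.
    messages[:] = [dict(GREETING)]

messages = [dict(GREETING)]
messages.append({"role": "user", "content": "I have a headache. What should I do?"})
messages.append({"role": "assistant", "content": "Rest and drink plenty of water."})

print([m["role"] for m in messages])  # ['assistant', 'user', 'assistant']

clear_chat(messages)
print(len(messages))  # 1
```

Streamlit reruns the whole script on every interaction, which is why the history has to live in st.session_state rather than in an ordinary local list.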

Go to the Llama2TutorialWorkflow page, click on Use Workflow, select the Call by API tab, then click Copy Code.

Use Workflow

Now, create a new file: llama.py, and paste the copied code into it. Then, modify the code as follows:

######################################################################################################
# In this section, we set the user authentication, user and app ID, model details, and the URL of 
# the text we want as an input. Change these strings to run your own example.
######################################################################################################

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc
from clarifai_grpc.grpc.api.status import status_code_pb2
import streamlit as st

# Your PAT (Personal Access Token) can be found in the portal under Authentication
PAT = st.secrets.PAT
# Specify the correct user_id/app_id pairings
# Since you're making inferences outside your app's scope
USER_ID = st.secrets.USER_ID
APP_ID = st.secrets.APP_ID
# Change this to the workflow you want to use
WORKFLOW_ID = 'Llama2TutorialWorkflow'

############################################################################
# YOU DO NOT NEED TO CHANGE ANYTHING BELOW THIS LINE TO RUN THIS EXAMPLE
############################################################################

def get_response(prompt):
    channel = ClarifaiChannel.get_grpc_channel()
    stub = service_pb2_grpc.V2Stub(channel)

    metadata = (('authorization', 'Key ' + PAT),)

    userDataObject = resources_pb2.UserAppIDSet(user_id=USER_ID, app_id=APP_ID)

    response = ""  # save response from the model

    post_workflow_results_response = stub.PostWorkflowResults(
        service_pb2.PostWorkflowResultsRequest(
            user_app_id=userDataObject,  
            workflow_id=WORKFLOW_ID,
            inputs=[
                resources_pb2.Input(
                    data=resources_pb2.Data(
                        text=resources_pb2.Text(
                            raw=prompt
                        )
                    )
                )
            ]
        ),
        metadata=metadata
    )

    if post_workflow_results_response.status.code != status_code_pb2.SUCCESS:
        print(post_workflow_results_response.status)

        return response

    # We'll get one WorkflowResult for each input we used above. Because of one input, we have here one WorkflowResult
    results = post_workflow_results_response.results[0]

    # Each model we have in the workflow will produce one output.
    for output in results.outputs:
        model = output.model

        print("Predicted concepts for the model `%s`" % model.id)
        for concept in output.data.concepts:
            print("\t%s %.2f" % (concept.name, concept.value))

        response += output.data.text.raw + "\n"

    # Uncomment this line to print the full Response JSON
    # print(results)
    return response
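The st.secrets values above are read from Streamlit's secrets file. For local development, you can create a .streamlit/secrets.toml file next to app.py; the values below are placeholders, so substitute your own PAT, user ID, and app ID from the Clarifai portal:

```toml
# .streamlit/secrets.toml -- placeholder values, replace with your own
PAT = "your-clarifai-personal-access-token"
USER_ID = "your-clarifai-user-id"
APP_ID = "your-clarifai-app-id"
```

Keep this file out of version control (add .streamlit/secrets.toml to your .gitignore) so your Personal Access Token is never pushed to the repository.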

Next, create a requirements.txt file and add the following packages:

clarifai-grpc
streamlit
streamlit-chat

Create a new GitHub repository and push your code to it. Before deploying, you can test the app locally by running streamlit run app.py in your terminal.

Perfect!

Now, from Clarifai app page, select Modules from the left sidebar and click on the Create Module button.

Create Module

Fill in the required fields and click on the Create Module button.

Create Module

Scroll down a little and enter your GitHub repository URL with the branch specified. Fill in the remaining required fields and click on the Create Module Version button.

Create Module

Wait a few minutes until the module is ready. Once it is ready, scroll all the way down and click, in order, the Install Module, Install to this App, Click to use installed module version, and Authorize buttons.

Install Module
Authorize

Cool! Now, you should see the following screen:

App UI

Feel free to test your app and play around with it.