Beginner’s Guide to Creating AI Agents With LangChain
Now we come to the most exciting part of using LangChain: creating AI Agents.
In the previous articles, we learnt how to use chains in order to get an LLM to respond to our questions. We also learnt how to get the LLM to use external sources of data as context, as well as use the chat history as context for generating its response.
Difference between Chains and AI Agents
In a chain, the sequence of actions is hardcoded. In an agent, it is not: we let the language model decide which actions to take and in which order.
With an AI agent, we equip the LLM with the ability to reason and determine its next action. Soon, you will see how fascinating it is to observe the reasoning of an LLM.
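For contrast, here is a minimal sketch of a hardcoded chain (the prompt text and the question are only placeholders): every input follows the same fixed path from prompt to LLM to output parser, and the model never gets to choose a different route.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI
# The sequence prompt -> LLM -> parser is fixed in code; the model makes no decisions about the flow
prompt = ChatPromptTemplate.from_template("Answer the question in one sentence: {question}")
llm = ChatOpenAI(api_key="Your API Key here")
chain = prompt | llm | StrOutputParser()
chain.invoke({"question": "What is DevOps?"})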
Tools
AI Agents are provided with a set of tools. They are free to decide which tool to use based on each tool’s description and the task at hand.
The retriever mentioned in the previous articles, which searches a vector database for relevant pieces of information, is a tool.
Another simple example of a tool is Web Search. There are multiple APIs available that allow AI Agents to search the web in order to find a particular piece of information based on the task given to them.
The langchain_community.tools package has a huge list of tools that can be provided to the AI Agent.
Retriever Tool
A retriever tool retrieves relevant data from a vector database based on the user input; the retrieved data is then sent to an LLM as context for that input. This is done when the source data is large and cannot be sent as part of the prompt. In the example below, data from the website https://www.dasa.org is stored in a vector database, and the retriever tool is used to retrieve relevant information from that vector database to be passed to the LLM as context.
To provide this tool to the AI Agent, we first need to create the retriever and then wrap it in a tool.
To know more about creating the retriever, please read my earlier article on Creating Retrieval Chains Using LangChain.
from langchain_community.document_loaders import WebBaseLoader
# Load the pages from the DASA website
loader = WebBaseLoader("https://www.dasa.org")
docs = loader.load()
from langchain_text_splitters import RecursiveCharacterTextSplitter
# Split the loaded documents into smaller chunks for embedding
text_splitter = RecursiveCharacterTextSplitter()
documents = text_splitter.split_documents(docs)
from langchain_openai import OpenAIEmbeddings
# Create the embedding model (expects the OPENAI_API_KEY environment variable to be set)
embeddings = OpenAIEmbeddings()
from langchain_community.vectorstores import FAISS
# Store the embedded chunks in a FAISS vector store and expose it as a retriever
vector = FAISS.from_documents(documents, embeddings)
retriever = vector.as_retriever()
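As a quick sanity check (optional; the query here is just an example), you can call the retriever directly and look at the chunks it returns:
# Retrievers are Runnables, so invoke() returns the matching document chunks
relevant_docs = retriever.invoke("What is the mission of DASA?")
print(relevant_docs[0].page_content[:200])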
Let’s now import the create_retriever_tool function. We import it as below:
from langchain.tools.retriever import create_retriever_tool
Next, we create the tool by giving it a name and description.
retriever_tool = create_retriever_tool(
    retriever,
    "DASA_search",
    "Search for information about DASA. For any questions about DASA, you must use this tool",
)
Web Search Tool
Let’s create another tool that can be used to search the Internet. This can be done through the Tavily Search tool, for which you need an API key that you can get for free at tavily.com.
import os
os.environ['TAVILY_API_KEY'] = "<Your Tavily API Key here>"
from langchain_community.tools.tavily_search import TavilySearchResults
search = TavilySearchResults()
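If you want to try the search tool on its own before handing it to the agent (the query is just an example), it can be invoked directly:
# Tools are Runnables too; the result is a list of search hits with URLs and content snippets
results = search.invoke({"query": "weather in Kuala Lumpur"})
print(results)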
Now, we have two tools that we can give to the AI Agent. Let us put them in a list.
tools = [retriever_tool, search]
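The agent will choose between these tools based on their names and descriptions, so it is worth checking what it will see:
# Print the name and description the agent uses when deciding which tool to call
for tool in tools:
    print(tool.name, ":", tool.description)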
Create the AI Agent
Now that we have the tools, we can create an agent to use them.
First, install the langchainhub package using the terminal.
pip install langchainhub
Install the langchain-openai package.
pip install langchain-openai
Next, we will get a predefined prompt from a prompt hub that is designed to make the LLM act as an AI Agent.
from langchain import hub
prompt = hub.pull("hwchase17/openai-functions-agent")
Let’s learn more about this prompt that makes the LLM act as an AI agent.
The prompt is pulled from a hub of prompts and is specifically designed for creating AI agents.
It contains an agent scratchpad, which the LLM uses to think through possible actions and decide on the next one to take towards its goal.
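If you are curious, you can inspect the pulled prompt to see its structure, including the placeholder for the agent scratchpad (the exact messages may vary slightly between versions of the hub prompt):
# The hub prompt is a ChatPromptTemplate with an "agent_scratchpad" placeholder among its messages
print(prompt.input_variables)
print(prompt.messages)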
We will cover the fascinating details about how AI agents think in the next article.
Now, let’s get an LLM ready.
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(api_key="Your API Key here")
Next, let us import what we need from LangChain to create and execute agents.
from langchain.agents import create_openai_functions_agent
from langchain.agents import AgentExecutor
We are now ready to create an AI agent. The AI agent needs an LLM, tools, and a prompt.
agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
Invoke the AI Agent
All that remains is to invoke the AI Agent.
agent_executor.invoke({"input":"What is the mission of DASA"})
The output is:
> Entering new AgentExecutor chain…
Invoking: DASA_search with {'query': 'mission of DASA'}
{'input': 'What is the mission of DASA',
'output': 'The mission of DASA (DevOps Agile Skills Association) is to facilitate the journey towards flow, business agility, and value maximization. DASA aims to help organizations transform into high-performance digital organizations by prioritizing customer-focused innovation, fostering a culture of agility and teamwork, and emphasizing key principles crucial for effective adoption and transition to a DevOps-centric approach. DASA offers a portfolio of talent and guidance products to boost enterprise transformation success and provides a collaborative space for leaders and professionals to share ideas and adopt best practices in the field.'}
Based on the user input, the AI agent decided to use the retriever tool.
Let’s now ask a question that is not related to DASA.
agent_executor.invoke({"input":"What is weather in Kuala Lumpur?"})
The output is:
> Entering new AgentExecutor chain…
Invoking: tavily_search_results_json with {'query': 'weather in Kuala Lumpur'}
{'input': 'What is weather in Kuala Lumpur?',
'output': 'The current weather in Kuala Lumpur is partly cloudy with a temperature of 29.0°C (84.2°F). The humidity is at 84% and the wind speed is 3.6 km/h coming from the east-southeast direction. The visibility is 10.0 km and the UV index is 7.0.'}
This time, the AI agent decided to use the Tavily Search tool to search the Internet instead of the retriever tool.
Congratulations! You have created a thinking AI agent that can decide on its own what step to take.
In the next article, we will have a peep into the “mind” of an AI Agent to know exactly what “thoughts” make the AI Agent decide on the next step.
Previous Articles:
Beginner’s Guide To Retrieval Chain From LangChain
Beginner’s Guide to Conversational Retrieval Chain From LangChain