# Integrate LangChain with kluster.ai
LangChain offers a range of features, like memory modules for context tracking, retrieval augmentation to feed external data into prompts, and customizable multi-step "chains" to break down complex tasks. By leveraging these capabilities with the kluster.ai API, you can build more robust and context-aware solutions that seamlessly handle everything from short-form answers to intricate conversations.
This guide demonstrates how to integrate the `ChatOpenAI` class from the `langchain_openai` package with the kluster.ai API, then walks through building a multi-turn conversational agent that leverages LangChain's memory for context-aware interactions.
## Prerequisites
Before starting, ensure you have the following:
- A kluster.ai account - sign up on the kluster.ai platform if you don't have one
- A kluster.ai API key - after signing in, go to the API Keys section and create a new key. For detailed instructions, check out the Get an API key guide
- A Python virtual environment - this is optional but recommended. Ensure that you enter the Python virtual environment before following along with this tutorial
- LangChain packages installed - install the `langchain` packages:

  ```bash
  pip install langchain langchain_community langchain_core langchain_openai
  ```

  As a shortcut, you can also run:

  ```bash
  pip install "langchain[all]"
  ```
## Quick Start
It's easy to integrate kluster.ai with LangChain. When configuring the chat model, point your `ChatOpenAI` instance to the kluster.ai base URL and configure the following settings:

- Base URL - use `https://api.kluster.ai/v1` to send requests to the kluster.ai endpoint
- API key - replace `INSERT_API_KEY` in the code below with your kluster.ai API key. If you don't have one yet, refer to the Get an API key guide
- Select your model - choose one of kluster.ai's available models based on your use case
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.kluster.ai/v1",
    api_key="INSERT_API_KEY",  # Replace with your actual API key
    model="klusterai/Meta-Llama-3.1-8B-Instruct-Turbo",
)

llm.invoke("What is the capital of Nepal?")
```
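This works because kluster.ai exposes an OpenAI-compatible endpoint: under the hood, `ChatOpenAI` posts a standard chat-completions request body to the base URL you supply. A rough sketch of that payload, assembled by hand (the field names follow the public OpenAI chat schema; the real request the client builds may carry additional parameters):

```python
# Hand-built sketch of the OpenAI-style request body that an OpenAI-compatible
# endpoint such as kluster.ai expects at POST {base_url}/chat/completions.
def build_chat_payload(model: str, prompt: str) -> dict:
    """Assemble a minimal chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_payload(
    "klusterai/Meta-Llama-3.1-8B-Instruct-Turbo",
    "What is the capital of Nepal?",
)
print(payload["model"])  # → klusterai/Meta-Llama-3.1-8B-Instruct-Turbo
```

Because only the base URL and model name change, any OpenAI-compatible client or framework can be pointed at kluster.ai in the same way.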
That's all you need to start with LangChain and the kluster.ai API! Next, this guide will explore building a multi-turn conversational agent that showcases how memory and context can elevate your chatbot to a more interactive, intelligent experience.
## Build a multi-turn conversational agent
This section will explore what LangChain can do beyond a single prompt-and-response interaction. One standout feature of LangChain is its built-in memory, which tracks conversation context across multiple user queries. In the following steps, you'll set up a multi-turn conversational agent that takes advantage of this memory and seamlessly integrates with the kluster.ai API.
- Create file - create a new file called `langchain-advanced.py` using the following command in your terminal:

  ```bash
  touch langchain-advanced.py
  ```
- Import LangChain components - inside your new file, import the following components for memory management, prompt handling, and kluster.ai integration:

  ```python
  from langchain.chains import ConversationChain
  from langchain.chains.conversation.memory import ConversationBufferMemory
  from langchain_community.chat_message_histories import ChatMessageHistory
  from langchain_core.messages import HumanMessage
  from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
  from langchain_openai import ChatOpenAI
  ```
- Create a memory instance - store and manage the conversation's context so the chatbot can remember previous user messages:

  ```python
  # Create a memory instance to store the conversation
  message_history = ChatMessageHistory()
  memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
  ```
- Configure the `ChatOpenAI` model - point to kluster.ai's endpoint with your API key and chosen model. Remember, you can always change the selected model based on your needs:

  ```python
  # Create your LLM, pointing to kluster.ai's endpoint
  llm = ChatOpenAI(
      base_url="https://api.kluster.ai/v1",
      api_key="INSERT_API_KEY",
      model="klusterai/Meta-Llama-3.1-8B-Instruct-Turbo",
  )
  ```
- Define a prompt template - include a system instruction for the assistant, a placeholder for the conversation history, and an input slot for the user's query:

  ```python
  # Define the prompt template, including the system instruction and placeholders
  prompt = ChatPromptTemplate.from_messages([
      ("system", "You are a helpful assistant."),
      MessagesPlaceholder(variable_name="chat_history"),
      ("human", "{input}"),
  ])
  ```
- Create the `ConversationChain` - pass in the LLM, memory, and prompt template so every new user query is automatically enriched with the stored conversation context and guided by the assistant's role:

  ```python
  # Create the conversation chain
  conversation = ConversationChain(
      llm=llm,
      memory=memory,
      prompt=prompt,
  )
  ```
- Prompt the model with the first question - you can prompt the model with any question. The example chosen here is designed to demonstrate context awareness between questions:

  ```python
  # Send the first user prompt
  question1 = "Hello! Can you tell me something interesting about the city of Kathmandu?"
  print("Question 1:", question1)
  response1 = conversation.predict(input=question1)
  print("Response 1:", response1)
  ```
- Pose a follow-up question - ask another question without resupplying the city name and notice how LangChain's memory implicitly handles the context. Print the questions and responses to see how the stored conversation informs each new query, creating a true multi-turn interaction:

  ```python
  # Send a follow-up question referencing previous context
  question2 = "What is the population of that city?"
  print("\nQuestion 2:", question2)
  response2 = conversation.predict(input=question2)
  print("Response 2:", response2)
  ```
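Conceptually, `ConversationBufferMemory` simply records every exchange and replays it into the `chat_history` placeholder before each new call. The dependency-free toy below sketches that bookkeeping; the class and method names here are illustrative, not LangChain's API:

```python
# Toy illustration of buffer-style conversation memory: every turn is stored
# and replayed into the prompt for the next call. Illustrative names only.
class ToyBufferMemory:
    def __init__(self):
        self.turns = []  # list of (role, text) pairs, oldest first

    def save_context(self, user_text, ai_text):
        """Record one user/assistant exchange."""
        self.turns.append(("human", user_text))
        self.turns.append(("ai", ai_text))

    def render_history(self):
        """Flatten the stored turns into prompt-ready lines."""
        return [f"{role}: {text}" for role, text in self.turns]

toy_memory = ToyBufferMemory()
toy_memory.save_context("Tell me about Kathmandu.", "Kathmandu is Nepal's capital.")
toy_memory.save_context("What country is it in?", "It is in Nepal.")

# A third model call would see both earlier exchanges in its prompt:
for line in toy_memory.render_history():
    print(line)
```

This is why the follow-up question above can say "that city": by the time it is asked, the first exchange is already part of the prompt.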
View complete script
```python
from langchain.chains import ConversationChain
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

# Create a memory instance to store the conversation
message_history = ChatMessageHistory()
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# Create your LLM, pointing to kluster.ai's endpoint
llm = ChatOpenAI(
    base_url="https://api.kluster.ai/v1",
    api_key="INSERT_API_KEY",
    model="klusterai/Meta-Llama-3.1-8B-Instruct-Turbo",
)

# Define the prompt template, including the system instruction and placeholders
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="chat_history"),
    ("human", "{input}"),
])

# Create the conversation chain
conversation = ConversationChain(
    llm=llm,
    memory=memory,
    prompt=prompt,
)

# Send the first user prompt
question1 = "Hello! Can you tell me something interesting about the city of Kathmandu?"
print("Question 1:", question1)
response1 = conversation.predict(input=question1)
print("Response 1:", response1)

# Send a follow-up question referencing previous context
question2 = "What is the population of that city?"
print("\nQuestion 2:", question2)
response2 = conversation.predict(input=question2)
print("Response 2:", response2)
```
## Put it all together
- Use the following command to run your script:

  ```bash
  python langchain-advanced.py
  ```
- You should see output that resembles the following:

  ```text
  python langchain-advanced.py
  Question 1: Hello! Can you tell me something interesting about the city of Kathmandu?
  Response 1: Kathmandu, the capital city of Nepal, is indeed a treasure trove of history, culture, and natural beauty. Here's something interesting:

  Kathmandu is home to the famous Boudhanath Stupa, a UNESCO World Heritage Site. It's one of the largest Buddhist stupas in the world and is considered a sacred site by Buddhists. The stupa is over 36 meters (118 feet) high and is built in a unique octagonal shape. Its massive size is so prominent that it can be seen from many parts of the city.

  Another fascinating fact is that Kathmandu has managed to conserve its rich cultural heritage, which dates back to the 12th century. You can see ancient temples, palaces, streets, and marketplaces that have been beautifully preserved and restored.

  Lastly, Kathmandu is also known for its Newar culture, which is the indigenous culture of the city. The Newars have a rich tradition of art, music, and cuisine, which is reflected in the vibrant festivals and celebrations that take place throughout the year.

  Would you like to know more about Kathmandu's culture, history, or maybe some of its modern attractions?

  Question 2: What is the population of that city?
  Response 2: Kathmandu, the capital city of Nepal, has a population of around 374,405 people (as per the 2021 estimates). However, the Kathmandu Valley, which includes the surrounding municipalities and areas, has a population of over 3.2 million people.

  When considering the larger metropolitan area that includes the neighboring cities like Lalitpur (Patan) and Bhaktapur, the population exceeds 5 million people, making it one of the largest urban agglomerations in Nepal.

  It's worth noting that Nepal's population density is relatively high, with many people living in urban areas. The Kathmandu Valley, in particular, is one of the most densely populated regions in the country.
  ```
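One caveat as conversations grow: `ConversationBufferMemory` replays the entire history on every call, so the prompt gets longer with each turn. A common mitigation is a sliding window that keeps only the last k exchanges. Here is a dependency-free toy sketch of that idea (LangChain's `ConversationBufferWindowMemory` offers comparable behavior; the class below is purely illustrative):

```python
# Toy sliding-window memory: keep only the most recent k exchanges so the
# prompt doesn't grow without bound. Illustrative only, not LangChain's API.
from collections import deque

class WindowMemory:
    def __init__(self, k):
        # Each exchange is one (user, assistant) pair; keep at most k of them.
        self.exchanges = deque(maxlen=k)

    def save_context(self, user_text, ai_text):
        self.exchanges.append((user_text, ai_text))

    def history(self):
        return list(self.exchanges)

mem = WindowMemory(k=2)
mem.save_context("Q1", "A1")
mem.save_context("Q2", "A2")
mem.save_context("Q3", "A3")  # Q1/A1 falls out of the window
print(mem.history())  # → [('Q2', 'A2'), ('Q3', 'A3')]
```

For short demos like this one the full buffer is fine; windowing matters once a chat runs for dozens of turns.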
That's it! You've successfully integrated LangChain with the kluster.ai API, and your multi-turn conversational agent is ready to use. For more information about the capabilities of LangChain, be sure to check out the LangChain docs.