
Using LangChain with the kluster.ai API

This guide demonstrates how to integrate the ChatOpenAI class from the langchain_openai package with the kluster.ai API. By combining LangChain’s capabilities with kluster.ai’s large language models, you can seamlessly create powerful applications.

Prerequisites

Before starting, ensure you have the following:

  • LangChain installed - install the langchain and langchain-openai packages (the ChatOpenAI class used in this guide ships in langchain-openai); a quick way to verify the installation is shown after this list:

    pip install langchain langchain-openai
    
  • A kluster.ai account - sign up on the kluster.ai platform if you don't have one

  • A kluster.ai API key - after signing in, go to the API Keys section and create a new key. For detailed instructions, check out the Get an API key guide
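
To confirm that both packages are installed before moving on, you can run a quick, optional check with the standard library's importlib.metadata module:

from importlib.metadata import version

# Prints the installed versions; a missing package raises PackageNotFoundError
print("langchain:", version("langchain"))
print("langchain-openai:", version("langchain-openai"))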

Integrate with LangChain

Integrating kluster.ai with LangChain is straightforward: point your ChatOpenAI instance at the kluster.ai base URL and configure a few settings.

  • Base URL - use https://api.kluster.ai/v1 to send requests to the kluster.ai endpoint
  • API key - replace INSERT_API_KEY in the code below with your own kluster.ai API key. If you don’t have one yet, refer to the Get an API key guide
  • Select your model - choose one of kluster.ai’s available models based on your use case. For more details, see kluster.ai’s models

from langchain_openai import ChatOpenAI

# Point the standard ChatOpenAI client at the kluster.ai endpoint
llm = ChatOpenAI(
    base_url="https://api.kluster.ai/v1",
    api_key="INSERT_API_KEY",  # Replace with your actual API key
    model="klusterai/Meta-Llama-3.1-8B-Instruct-Turbo",
)

# invoke() returns an AIMessage; the generated text is in its .content attribute
response = llm.invoke("What is the capital of Nepal?")
print(response.content)
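
If you would rather not hard-code the key in your script, one common alternative is to read it from an environment variable. The variable name KLUSTER_API_KEY below is only an example, not something the API requires:

import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.kluster.ai/v1",
    # KLUSTER_API_KEY is an example name; export it in your shell before running
    api_key=os.environ["KLUSTER_API_KEY"],
    model="klusterai/Meta-Llama-3.1-8B-Instruct-Turbo",
)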

That's it! You’ve successfully integrated LangChain with the kluster.ai API. Your configured LLM is now ready to deliver the full range of LangChain capabilities.
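
As a quick illustration of those capabilities, the sketch below composes the llm object configured above into a LangChain chain with a prompt template and an output parser. It assumes langchain-core, which langchain and langchain-openai install as a dependency:

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

# Build a prompt template and pipe it into the kluster.ai-backed model
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise geography assistant."),
    ("human", "{question}"),
])
chain = prompt | llm | StrOutputParser()

# StrOutputParser extracts the plain text from the model's response
print(chain.invoke({"question": "What is the capital of Nepal?"}))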