Note: there is a Python package for a native LangChain integration with Nebius: https://pypi.org/project/langchain-nebius/
This post covers an alternative approach using LiteLLM.
Nebius AI Studio is a platform from Nebius that simplifies the process of building applications using AI models. It provides a suite of tools and services for developers to easily test, integrate and fine-tune various AI models, including those for text and image generation.
You can check out the list of available models here.
LangChain is one of the most popular open source frameworks for building Generative AI / LLM based applications.
In this blog, we will cover how you can use LiteLLM to access Nebius AI Studio models when building LLM/agentic applications with LangChain and LangGraph (LangGraph is an open-source AI agent framework designed to build, deploy, and manage complex generative AI agent workflows).
LiteLLM Nebius AI Studio provider
LiteLLM is a library and proxy server that simplifies interactions with various Large Language Model (LLM) APIs, allowing developers to use a consistent interface with over 100 different LLMs. It essentially provides a standardized OpenAI API format for these different providers, making it easier to switch between them without rewriting code.
LiteLLM includes Nebius AI Studio as one of the LLM providers.
To use a Nebius model with LiteLLM, make sure the NEBIUS_API_KEY environment variable is set:
import os

os.environ['NEBIUS_API_KEY'] = "insert-your-nebius-ai-studio-api-key"
This is how we reference a Nebius model (the nebius/ prefix is what routes LiteLLM to Nebius AI Studio):
model="nebius/Qwen/Qwen3-235B-A22B"
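To make the routing concrete, here is a minimal sketch. The `nebius_request` helper below is hypothetical (not part of LiteLLM); it only assembles the keyword arguments that you would then pass to `litellm.completion(**req)`:

```python
import os

# Hypothetical helper (not part of LiteLLM): build the keyword arguments
# for litellm.completion() against a Nebius-hosted model. The "nebius/"
# prefix is what routes the request to Nebius AI Studio.
def nebius_request(model_id: str, user_message: str) -> dict:
    if not os.environ.get("NEBIUS_API_KEY"):
        raise RuntimeError("NEBIUS_API_KEY must be set")
    return {
        "model": f"nebius/{model_id}",
        "messages": [{"role": "user", "content": user_message}],
    }

os.environ["NEBIUS_API_KEY"] = "insert-your-nebius-ai-studio-api-key"
req = nebius_request("Qwen/Qwen3-235B-A22B", "Hello!")
print(req["model"])  # nebius/Qwen/Qwen3-235B-A22B
```

In a real call you would then invoke `litellm.completion(**req)` with a valid API key.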
Using LiteLLM with LangChain
You can use the ChatLiteLLM chat model from the LangChain Community models.
from langchain_community.chat_models import ChatLiteLLM
And here's how you can use a Nebius AI model with ChatLiteLLM:
from langchain_community.chat_models import ChatLiteLLM
# Create LLM class
llm = ChatLiteLLM(model="nebius/Qwen/Qwen3-235B-A22B")
Creating a ReAct Agent with LangGraph
Once you have initialized the LLM, you can use it to create agents with LangGraph.
In the example below, we use LangGraph to create a ReAct agent that can interact with a Couchbase database using the Model Context Protocol (MCP). ReAct agents iteratively think, use tools, and act on observations to achieve user goals, dynamically adapting their approach. LangGraph offers a prebuilt ReAct agent (create_react_agent).
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from langgraph.checkpoint.memory import InMemorySaver
from langchain_community.chat_models import ChatLiteLLM
from mcp import ClientSession
from mcp.client.stdio import stdio_client
import os

# Ensure the NEBIUS_API_KEY environment variable is set
os.environ['NEBIUS_API_KEY'] = "insert-your-nebius-ai-studio-api-key"

llm = ChatLiteLLM(
    model="nebius/Qwen/Qwen3-235B-A22B",
)

# server_params, system_prompt and qna are defined elsewhere in the project
async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            print("Initializing connection...")
            await session.initialize()

            # Get tools
            print("Loading tools...")
            tools = await load_mcp_tools(session)

            # Create and run the agent
            print("Creating agent...")
            checkpoint = InMemorySaver()
            agent = create_react_agent(
                llm,
                tools,
                prompt=system_prompt,
                checkpointer=checkpoint,
            )

            print("-" * 25, "Starting Run", "-" * 25)
            await qna(agent)
The main function ties everything together to set up and run our agent:
- Start & Connect to MCP Server: It first starts the mcp-server-couchbase process using stdio_client and establishes a communication ClientSession with it.
- Initialize Session & Load Tools: The session is initialized. Then, load_mcp_tools queries the MCP server to get the available Couchbase tools and prepares them for LangChain.
- Set Up Agent Memory: InMemorySaver is created to allow the agent to remember conversation history.
- Create ReAct Agent: The create_react_agent function builds our AI agent, providing it with the language model, the Couchbase tools, our system_prompt, and the checkpoint for memory.
- Run Q&A: Finally, it calls the qna function, passing the created agent to start the question-and-answer process with the database.
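For reference, here is a minimal sketch of what the qna helper described above could look like. This is an assumption, not the project's actual implementation (which lives in its GitHub repository); the exact signature and the questions parameter are illustrative. It sends each question to the agent under a fixed thread_id so the InMemorySaver checkpointer can carry conversation history across turns:

```python
import asyncio

# Sketch of a qna() helper (hypothetical; the real one is in the project repo).
# Each question is sent to the agent under the same thread_id so the
# checkpointer preserves conversation history between turns.
async def qna(agent, questions=("What collections exist in the database?",)):
    config = {"configurable": {"thread_id": "demo-session"}}
    answers = []
    for question in questions:
        result = await agent.ainvoke(
            {"messages": [{"role": "user", "content": question}]},
            config=config,
        )
        # The agent's final answer is the last message in the returned state
        answer = result["messages"][-1].content
        print(answer)
        answers.append(answer)
    return answers
```

Because the thread_id stays constant, a follow-up question like "Show me one document from it" can resolve "it" from the earlier turns.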
To dive deeper into this example, you can check out the GitHub repository for this project.
Conclusion
While Nebius AI Studio doesn't yet have direct integrations with LangChain, the flexibility of the AI dev ecosystem means you're not stuck waiting. With LiteLLM, you can use Nebius models with LangChain/LangGraph to build agentic applications.