LangGraph Basic Chatbot with AgentOps
This example shows you how to build a basic chatbot using LangGraph’s StateGraph, with comprehensive tracking via AgentOps.

What We’re Building
A stateful chatbot using LangGraph fundamentals:
- 🗃️ StateGraph: Core LangGraph structure for managing conversation state
- 💬 Chat Model: LLM integration for generating responses
- 🔄 State Management: Automatic message history tracking with `add_messages`
- 🎯 Graph Flow: START → chatbot node → END pattern
Step-by-Step Implementation
Step 1: Install Dependencies
Install LangGraph with your preferred chat model, plus AgentOps for tracking. AgentOps gives you:
- 📊 Graph execution tracking with node transitions and timing
- 💰 LLM cost monitoring with token usage breakdown
- 🔄 State change visualization showing message flow
- 📈 Performance metrics for each graph execution
- 🐛 Execution replay for debugging graph flows
Step 2: Create Your Project Structure
Create a simple Python project for your chatbot.

Step 3: Set Up Environment Variables
Create your `.env` file with the necessary API keys:
- OpenAI API Key: OpenAI Platform
- AgentOps API Key: AgentOps Settings
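Your `.env` file might look like this (placeholder values; substitute your real keys):

```bash
OPENAI_API_KEY="your-openai-api-key"
AGENTOPS_API_KEY="your-agentops-api-key"
```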
Step 4: Build Your Basic Chatbot
Edit `chatbot.py` to create your LangGraph chatbot:
Step 5: Initialize the Chat Model and Create the Chatbot Node
Add the model and node function.

Step 6: Build and Compile the StateGraph
Construct your LangGraph workflow.

Step 7: Add the Streaming Chat Function
Create the interactive chat interface with AgentOps tracking.

Step 8: Run Your Chatbot
Execute your chatbot. When it runs:
- An AgentOps session starts automatically
- Interactive chat loop begins
- Each user message flows through: START → chatbot node → END
- LLM generates responses based on conversation history
- AgentOps captures all state transitions and LLM interactions
- Session ends when you type ‘quit’
View Results in AgentOps Dashboard
After running your chatbot, visit your AgentOps Dashboard to see:
- Graph Structure: Visual representation of your StateGraph (START → chatbot → END)
- State Transitions: How messages flow through the graph
- LLM Interactions: Every conversation turn with prompts and responses
- Execution Timing: How long each node takes to process
- Session Analytics: Conversation length, token usage, and costs
- Message History: Complete conversation flow with state management
Key Files Created
Project structure you built:
- `chatbot.py` - Complete LangGraph chatbot with AgentOps integration
- `.env` - API keys for OpenAI and AgentOps

Key AgentOps calls:
- `agentops.init()` - Enables automatic LangGraph instrumentation
- `agentops.start_session()` - Begins tracking each chat session
- `agentops.end_session()` - Completes the session with a status
Next Steps
- Add tools to your chatbot (web search, calculators, etc.)
- Implement more complex graph structures with conditional edges
- Add memory persistence across sessions
- Create multi-agent workflows with LangGraph
- Use AgentOps analytics to optimize conversation flows