Core Concepts
LangGraph enables you to build complex agentic workflows as graphs with:
- Nodes: Individual steps in your workflow (agents, tools, functions)
- Edges: Connections between nodes that define flow
- State: Shared data that flows through the graph
- Conditional Edges: Dynamic routing based on state or outputs
- Cycles: Support for iterative workflows and feedback loops
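To make these terms concrete, here is a toy graph runner in plain Python (deliberately not LangGraph's API): nodes are functions, state is a dict flowing between them, a conditional edge picks the next node from state, and the loop back to `add_one` is a cycle.

```python
# Toy sketch of the graph vocabulary above, in plain Python.
def add_one(state):
    # A node: takes state, mutates/returns it.
    state["value"] += 1
    return state

def check(state):
    return state  # routing is decided by the conditional edge below

nodes = {"add_one": add_one, "check": check}
edges = {"add_one": "check"}  # a plain edge: add_one always flows to check
# A conditional edge: loop back (a cycle) until value reaches 3, then stop.
conditional = {"check": lambda s: "add_one" if s["value"] < 3 else None}

def run(start, state):
    current = start
    while current is not None:
        state = nodes[current](state)
        if current in conditional:
            current = conditional[current](state)
        else:
            current = edges.get(current)
    return state

print(run("add_one", {"value": 0}))  # → {'value': 3}
```

LangGraph provides the same building blocks as a real API, as shown in the Usage section below.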
Installation
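A typical install command, assuming pip and the published package names for AgentOps, LangGraph, and the LangChain OpenAI integration (verify against each project's docs):

```shell
pip install agentops langgraph langchain langchain-openai
```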
Install AgentOps and LangGraph along with LangChain dependencies.
Setting Up API Keys
You’ll need API keys for AgentOps and your LLM provider:
- OPENAI_API_KEY: From the OpenAI Platform
- AGENTOPS_API_KEY: From your AgentOps Dashboard
You can export these as environment variables or store them in a .env file.
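For example, in your shell (placeholder values shown; use the same `KEY=value` lines in a .env file):

```shell
export AGENTOPS_API_KEY="your-agentops-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```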
Usage
Initialize AgentOps at the beginning of your application to automatically track all LangGraph operations.
What Gets Tracked
AgentOps automatically captures:
- Graph Structure: Nodes, edges, and entry points during compilation
- Execution Flow: The path taken through your graph
- Node Executions: Each node execution with inputs and outputs
- LLM Calls: All language model interactions within nodes
- Tool Usage: Any tools called within your graph
- State Changes: How state evolves through the workflow
- Timing Information: Duration of each node and total execution time
Advanced Example
More complex graphs combine conditional routing with tool-calling nodes.
Dashboard Insights
In your AgentOps dashboard, you’ll see:
- Graph Visualization: Visual representation of your compiled graph
- Execution Trace: Step-by-step flow through nodes
- Node Metrics: Performance data for each node
- LLM Analytics: Token usage and costs across all model calls
- Tool Usage: Which tools were called and their results
- Error Tracking: Any failures in node execution
Best Practices
- Initialize Early: Call agentops.init() before creating your graph
- Use Descriptive Names: Name your nodes clearly for better traces
- Handle Errors: Implement error handling in your nodes
- Monitor State Size: Large states can impact performance
- Leverage Conditional Edges: Use them for dynamic workflows
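The Handle Errors practice above can be sketched as a node that catches exceptions and records them in state instead of letting the run crash, so downstream nodes (and your traces) can see what failed. A plain-Python sketch; the names are illustrative:

```python
def safe_node(state):
    # Wrap node logic so one failure doesn't abort the whole graph run;
    # the error is captured in state for downstream nodes to inspect.
    try:
        result = 10 / state["divisor"]  # placeholder for real node work
        return {"result": result, "error": None}
    except ZeroDivisionError as exc:
        return {"result": None, "error": str(exc)}

print(safe_node({"divisor": 2}))  # → {'result': 5.0, 'error': None}
print(safe_node({"divisor": 0}))  # → {'result': None, 'error': 'division by zero'}
```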