LangChain is a framework for developing applications powered by language models. AgentOps automatically tracks your LangChain agents by integrating its callback handler.
Integration is done through the `LangchainCallbackHandler`. You don't need a separate `agentops.init()` call: the `LangchainCallbackHandler` initializes the AgentOps client automatically when an API key is passed to it or found in the environment.
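If you prefer to keep keys out of your code, a typical environment setup might look like the following. The variable names `AGENTOPS_API_KEY` and `OPENAI_API_KEY` follow the conventions read by the AgentOps client and LangChain's OpenAI wrapper respectively; the exact package list is an assumption based on the example below.

```shell
# Assumed setup: install the integration packages and export API keys.
# duckduckgo-search backs the DuckDuckGoSearchRun tool used in the example.
pip install agentops langchain langchain-community duckduckgo-search

export AGENTOPS_API_KEY="your-agentops-api-key"   # read by the AgentOps client
export OPENAI_API_KEY="your-openai-api-key"       # read by ChatOpenAI
```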
Here’s a basic example:
```python
import os

from langchain_community.chat_models import ChatOpenAI
from langchain.agents import initialize_agent, AgentType, Tool
from langchain_community.tools import DuckDuckGoSearchRun  # Example tool
from agentops.integration.callbacks.langchain import LangchainCallbackHandler

# 1. Initialize LangchainCallbackHandler
# AGENTOPS_API_KEY can be passed here or loaded from the environment
handler = LangchainCallbackHandler(
    api_key=os.getenv("AGENTOPS_API_KEY"), tags=["LangChain Example"]
)

# 2. Define tools for the agent
search_tool = DuckDuckGoSearchRun()
tools = [
    Tool(  # Wrap DuckDuckGoSearchRun in a Tool object
        name="DuckDuckGo Search",
        func=search_tool.run,
        description=(
            "Useful for when you need to answer questions about current "
            "events or the current state of the world."
        ),
    )
]

# 3. Configure the LLM with the AgentOps handler
# OPENAI_API_KEY can be passed here or loaded from the environment
llm = ChatOpenAI(
    openai_api_key=os.getenv("OPENAI_API_KEY"),
    callbacks=[handler],
    model="gpt-3.5-turbo",
    temperature=0,  # temperature=0 for reproducibility
)

# 4. Initialize the agent, passing the handler to callbacks
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
    callbacks=[handler],
    handle_parsing_errors=True,
)

# 5. Run the agent
try:
    response = agent.run(
        "Who is the current CEO of OpenAI and what is his most recent public statement?"
    )
    print(response)
except Exception as e:
    print(f"An error occurred: {e}")
```