AgentOps works seamlessly with LlamaIndex, a framework for building context-augmented generative AI applications with LLMs. AgentOps provides comprehensive observability into your LlamaIndex applications through automatic instrumentation, allowing you to monitor LLM calls, track performance, and analyze your application's behavior.
Simply set the global handler to “agentops” at the beginning of your LlamaIndex application. AgentOps will automatically instrument LlamaIndex to track your LLM interactions and application performance.
```python
from llama_index.core import set_global_handler
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Set the global handler to AgentOps
# NOTE: Feel free to set your AgentOps environment variables (e.g., 'AGENTOPS_API_KEY')
# as outlined in the AgentOps documentation, or pass the equivalent keyword arguments
# anticipated by AgentOps' AOClient as **eval_params in set_global_handler.
set_global_handler("agentops")

# Your LlamaIndex application code here
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Create a query engine
query_engine = index.as_query_engine()

# Query your data - AgentOps will automatically track this
response = query_engine.query("What is the main topic of these documents?")
print(response)
```
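As the comments above note, the AgentOps handler can be configured either through environment variables or through keyword arguments forwarded as `**eval_params`. The sketch below illustrates both approaches; the `AGENTOPS_API_KEY` variable name comes from the AgentOps documentation, and the placeholder key value is an assumption you should replace with your own.

```python
import os

# Option 1: set the AgentOps environment variable before installing the handler.
# The placeholder value below is illustrative only.
os.environ.setdefault("AGENTOPS_API_KEY", "<your-agentops-api-key>")

# Option 2: pass the key as a keyword argument instead; set_global_handler
# forwards extra keyword arguments as **eval_params to the handler it creates.
# Uncomment inside a LlamaIndex application:
# from llama_index.core import set_global_handler
# set_global_handler("agentops", api_key=os.environ["AGENTOPS_API_KEY"])
```

Setting the environment variable before calling `set_global_handler("agentops")` is usually the simpler route, since it keeps credentials out of your application code.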