LlamaIndex
AgentOps works seamlessly with LlamaIndex, a framework for building context-augmented generative AI applications with LLMs. AgentOps provides comprehensive observability into your LlamaIndex applications through automatic instrumentation, allowing you to monitor LLM calls, track performance, and analyze your application’s behavior.
Installation
Install AgentOps and the LlamaIndex AgentOps instrumentation package:
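A typical installation looks like the following (the instrumentation package name below is an assumption based on LlamaIndex's integration naming convention; check PyPI for the exact name):

```shell
# Core AgentOps SDK
pip install agentops

# LlamaIndex AgentOps instrumentation package (name assumed)
pip install llama-index-instrumentation-agentops
```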
Setting Up API Keys
You’ll need an AgentOps API key from your AgentOps Dashboard:
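AgentOps reads the key from the `AGENTOPS_API_KEY` environment variable, so you can export it in your shell or set it in code. A minimal sketch (the placeholder string stands in for your real key):

```python
import os

# Fall back to a placeholder only if the key wasn't already exported
# in the shell; replace it with the key from your AgentOps Dashboard.
os.environ.setdefault("AGENTOPS_API_KEY", "<your-agentops-api-key>")

api_key = os.environ["AGENTOPS_API_KEY"]
```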
Usage
Simply set the global handler to “agentops” at the beginning of your LlamaIndex application. AgentOps will automatically instrument LlamaIndex to track your LLM interactions and application performance.
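In code, this is LlamaIndex's standard `set_global_handler` hook; call it before constructing any indexes or query engines so every downstream operation is instrumented:

```python
from llama_index.core import set_global_handler

# Registers AgentOps as LlamaIndex's global callback handler.
# Run this once, at startup, before any indexing or querying.
set_global_handler("agentops")
```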
What Gets Tracked
When you use AgentOps with LlamaIndex, the following operations are automatically tracked:
- LLM Calls: All interactions with language models including prompts, completions, and token usage
- Embeddings: Vector embedding generation and retrieval operations
- Query Operations: Search and retrieval operations on your indexes
- Performance Metrics: Response times, token costs, and success/failure rates
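Concretely, a minimal pipeline like the following would exercise all four categories above (a sketch, not a definitive recipe: the `data` directory and query string are placeholders, and an OpenAI API key is assumed for LlamaIndex's default LLM and embedding model):

```python
from llama_index.core import (
    SimpleDirectoryReader,
    VectorStoreIndex,
    set_global_handler,
)

# Start instrumentation before any LlamaIndex work.
set_global_handler("agentops")

# Embedding generation during indexing is tracked.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The retrieval step and the LLM call (prompt, completion,
# token usage, latency) are tracked per query.
query_engine = index.as_query_engine()
response = query_engine.query("What does this document cover?")
print(response)
```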
Additional Resources
For more detailed information about LlamaIndex’s observability features and AgentOps integration, check out the LlamaIndex documentation.