LlamaIndex
LlamaIndex is a framework for building context-augmented generative AI applications with LLMs. AgentOps works seamlessly with it, providing observability into your LlamaIndex applications through automatic instrumentation.
Install the AgentOps SDK
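The SDK is published on PyPI, so a typical install looks like:

```bash
pip install agentops
```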
Install LlamaIndex AgentOps Instrumentation
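The LlamaIndex-side instrumentation ships as a separate integration package. The package name below is assumed from the LlamaIndex observability integrations and may differ by version; check the LlamaIndex documentation if the install fails:

```bash
pip install llama-index-instrumentation-agentops
```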
Add 2 lines of code
Make sure to call agentops.init before calling any openai, cohere, crew, etc. models.
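A minimal sketch of those two lines; the api_key value is a placeholder for your own key:

```python
import agentops

# Initialize AgentOps before any LLM calls so they are captured in your session
agentops.init(api_key="<your-agentops-api-key>")
```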
Set your API key as an .env variable for easy access.
Read more about environment variables in Advanced Configuration
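A minimal sketch of the .env approach, assuming the AGENTOPS_API_KEY variable name and the python-dotenv package (both are conventions for illustration, not requirements):

```python
# .env
# AGENTOPS_API_KEY=<your-agentops-api-key>

import os

import agentops
from dotenv import load_dotenv  # assumes: pip install python-dotenv

load_dotenv()  # pull variables from .env into the process environment
agentops.init(api_key=os.getenv("AGENTOPS_API_KEY"))
```

Depending on your SDK version, agentops.init may also pick up AGENTOPS_API_KEY from the environment without it being passed explicitly.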
Run your LlamaIndex application
Execute your program and visit app.agentops.ai/drilldown to observe your LlamaIndex application! 🕵️
After your run, AgentOps prints a clickable URL to the console that links directly to your session in the Dashboard.
Usage Pattern
Here’s a simple example of how to use AgentOps with LlamaIndex:
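The sketch below is illustrative only: it assumes the agentops handler name registered by the instrumentation package, the llama-index-llms-openai integration, and an API key exposed as AGENTOPS_API_KEY. Adapt the model and prompt to your application.

```python
from llama_index.core import set_global_handler
from llama_index.llms.openai import OpenAI  # assumes: pip install llama-index-llms-openai

# Register the AgentOps handler; the API key is read from AGENTOPS_API_KEY
set_global_handler("agentops")

# Use LlamaIndex as usual; the LLM call below is recorded in your AgentOps session
llm = OpenAI(model="gpt-4o-mini")
response = llm.complete("Briefly explain retrieval-augmented generation.")
print(response)
```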
Additional Resources
For more detailed information about LlamaIndex’s observability features and AgentOps integration, check out the LlamaIndex documentation.