Llama Stack
Llama Stack is a framework from Meta AI for building agentic applications. AgentOps integrates with Llama Stack via its Python client to provide observability into applications that leverage it.
Llama Stack has comprehensive documentation available as well as a great quickstart guide. You can use that guide to set up the Llama Stack server and client, or alternatively use our Docker Compose file.
Adding AgentOps to Llama Stack applications
Install the AgentOps SDK
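The SDK is available on PyPI; a typical install looks like:

```shell
pip install agentops
```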
Install the Llama Stack Client
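The Python client is distributed as the `llama-stack-client` package on PyPI:

```shell
pip install llama-stack-client
```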
Add 3 lines of code
Make sure to call agentops.init before calling any openai, cohere, crew, etc. models.
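A minimal sketch of this setup is below. It assumes your AgentOps API key is set in the environment, and that a Llama Stack server is already running locally; the `base_url` port is an assumption and should match wherever your server listens.

```python
import agentops
from llama_stack_client import LlamaStackClient

# Initialize AgentOps first, before any model calls are made.
# With no arguments, the API key is read from the AGENTOPS_API_KEY
# environment variable.
agentops.init()

# Point the client at your running Llama Stack server.
# The port here is an assumption; use your server's actual address.
client = LlamaStackClient(base_url="http://localhost:8321")
```

Any Llama Stack calls made through `client` after this point will be captured by AgentOps.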
Set your API key as an .env variable for easy access.
Read more about environment variables in Advanced Configuration
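For example, an .env file in your project root might contain (placeholder value shown):

```shell
AGENTOPS_API_KEY=<your-agentops-api-key>
```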
Run your 🦙🥞 application
Execute your program and visit app.agentops.ai/drilldown to observe your waterfall! 🕵️
After your run, AgentOps prints a clickable URL to the console that links directly to your session in the Dashboard.
Examples
An example notebook is available here to showcase how to use the Llama Stack client with AgentOps.