AgentOps integrates with Llama Stack via its Python client to provide observability into applications built on it. Llama Stack has comprehensive documentation available as well as a great quickstart guide. You can use that guide to set up the Llama Stack server and client, or alternatively use our Docker Compose file.
Adding AgentOps to Llama Stack applications
Add 3 lines of code
Make sure to call `agentops.init` before calling any OpenAI, Cohere, CrewAI, etc. models.
Run your 🦙🥞 application
Execute your program and visit app.agentops.ai/drilldown to observe your waterfall! 🕵️

