Track and analyze your LiteLLM calls across multiple providers with AgentOps
AgentOps provides seamless integration with LiteLLM, allowing you to automatically track all your LLM API calls across different providers through a unified interface.
Before using LiteLLM with AgentOps, you need to set up your API keys. You will need an AgentOps API key from the AgentOps dashboard, plus API keys for the LLM providers you plan to call through LiteLLM (for example, OpenAI or Anthropic).
You can then either export them as environment variables or set them in a .env file.
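For example, a .env file might look like the following (the key names follow the usual AgentOps and OpenAI conventions; the values are placeholders to replace with your real keys):

```shell
# .env — placeholder values, substitute your own keys
AGENTOPS_API_KEY="your-agentops-api-key"
OPENAI_API_KEY="your-openai-api-key"
```

Alternatively, export the same variables directly in your shell with `export AGENTOPS_API_KEY=...`.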
Then load the environment variables in your Python code:
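A minimal sketch of loading the keys, assuming the `python-dotenv` package is used to read the .env file (if it is not installed, the code falls back to variables already exported in the shell):

```python
import os

try:
    # python-dotenv reads variables from a local .env file, if present
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass  # fall back to variables already exported in the shell

# Read the keys from the environment; returns None if a key is missing
AGENTOPS_API_KEY = os.getenv("AGENTOPS_API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
```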
The simplest way to integrate AgentOps with LiteLLM is to set up the success_callback.
For more information on integrating AgentOps with LiteLLM, refer to the LiteLLM documentation on AgentOps integration.