LiteLLM
Call the latest models using the OpenAI format, including Llama, Mistral, Claude, Gemini, Gemma, DALL-E, and Whisper.
From LiteLLM’s docs:
Call 400+ LLMs using the same input/output format:
- Translate inputs to the provider's `completion`, `embedding`, and `image_generation` endpoints
- Consistent output: text responses will always be available at `['choices'][0]['message']['content']`
- Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI)
- Track spend & set budgets per project
LiteLLM also supports many providers.
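For illustration, here is a minimal sketch of plain LiteLLM usage showing this uniform interface. The model names are examples, the provider API keys are assumed to be set as environment variables, and the AgentOps-specific adjustment described below is not yet applied:

```python
from litellm import completion

messages = [{"role": "user", "content": "Hello, world"}]

# The same call shape works across providers; only the model string changes.
# (Assumes OPENAI_API_KEY and ANTHROPIC_API_KEY are set in the environment.)
openai_response = completion(model="gpt-4o-mini", messages=messages)
claude_response = completion(model="claude-3-haiku-20240307", messages=messages)

# Text is available at the same location regardless of provider.
print(openai_response["choices"][0]["message"]["content"])
print(claude_response["choices"][0]["message"]["content"])
```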
Using AgentOps with LiteLLM
Requires `litellm>=1.3.1`.
AgentOps requires a minor adjustment to how you call LiteLLM.
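A minimal sketch of the adjusted usage, assuming the adjustment is to call `completion` through the `litellm` module rather than importing the function directly, so that the AgentOps-instrumented function is used (the API key placeholder and model name are illustrative):

```python
import agentops
import litellm

# Initialize AgentOps before making LLM calls; the key can also be supplied
# via the AGENTOPS_API_KEY environment variable.
agentops.init(api_key="<AGENTOPS_API_KEY>")

messages = [{"role": "user", "content": "Write a haiku about tracing LLM calls."}]

# Do not import the function directly (from litellm import completion);
# call it through the module so the instrumented function is used.
response = litellm.completion(model="gpt-4o-mini", messages=messages)
print(response["choices"][0]["message"]["content"])
```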