Automatic LLM Call Tracking

AgentOps makes tracking LLM calls simple. Initialize the SDK with your API key, and all subsequent LLM calls are tracked automatically:

import agentops
from openai import OpenAI

# Initialize AgentOps
agentops.init("your-api-key")

# Make LLM calls as usual - they'll be tracked automatically
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, world!"}]
)

How it works

When the AgentOps SDK detects that a supported LLM provider module is installed, it automatically starts tracking its usage. No further work is required from you! 😊
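Conceptually, this style of auto-instrumentation usually works by wrapping the provider's call methods so each invocation is recorded before the result is returned. The sketch below is a toy illustration of that wrapping idea using made-up names (`FakeCompletions`, `track`), not AgentOps internals:

```python
import functools

calls = []  # recorded call metadata


class FakeCompletions:
    """Stand-in for a provider's completions API (hypothetical)."""

    def create(self, model, messages):
        return {"model": model, "reply": "Hello!"}


def track(fn):
    """Wrap an LLM method so every invocation is recorded."""

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        calls.append({"model": kwargs.get("model"), "result": result})
        return result

    return wrapper


# "Instrumentation": patch the provider method in place once, at init time
FakeCompletions.create = track(FakeCompletions.create)

client = FakeCompletions()
client.create(model="gpt-4o", messages=[{"role": "user", "content": "Hi"}])
print(len(calls))  # 1
```

Because the patch happens once at initialization, every later call through the client is captured with no change to your application code.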

Supported LLM Providers

AgentOps supports automatic tracking for many popular LLM providers, including:

  • OpenAI
  • Anthropic
  • Google (Gemini)
  • LiteLLM
  • And more

Not working?

Try these steps:

  1. Make sure you have the latest version of the AgentOps SDK installed. We are constantly updating it to support new LLM libraries and releases.
  2. Make sure you call agentops.init() after importing the LLM module but before making any LLM calls.
  3. Make sure the instrument_llm_calls parameter of agentops.init() is set to True (the default).
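The ordering in step 2 matters because patch-based instrumentation only affects calls that go through the patched name: a reference captured before initialization bypasses the wrapper entirely. A self-contained toy (made-up names, not the actual SDK) showing the effect:

```python
recorded = []  # prompts seen by the "instrumented" function


def llm_call(prompt):
    """Stand-in for a provider function (hypothetical)."""
    return f"echo: {prompt}"


early_ref = llm_call  # a reference grabbed BEFORE instrumentation

# "init": replace the module-level name with a tracking wrapper
_original = llm_call


def llm_call(prompt):
    recorded.append(prompt)
    return _original(prompt)


llm_call("tracked")     # goes through the wrapper and is recorded
early_ref("untracked")  # calls the original directly, so it is missed
print(recorded)  # ['tracked']
```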

Still not working? Please let us know! You can find us on Discord, GitHub, or email us at engineering@agentops.ai.

To get started, just follow the quick start guide.