Tracking LLM Calls
Tracking LLM Calls using the AgentOps SDK
How it works
When the AgentOps SDK detects `openai`, `litellm`, or `cohere` among your installed modules, it will automatically start tracking their usage. No further work is required from you! 😊
Not working?
Try these steps:
- Make sure you have the latest version of the AgentOps SDK installed. We are constantly updating it to support new LLM libraries and releases.
- Make sure you are calling `agentops.init()` after importing the LLM module but before calling the LLM method.
- Make sure the `instrument_llm_calls` parameter of `agentops.init()` is set to `True` (the default).
- If you have more than one concurrent session, make sure to patch the LLM call as described here.
Still not working? Please let us know! You can find us on Discord, GitHub, or email us at engineering@agentops.ai.
To get started, just follow the quick start guide.