AgentOps provides seamless integration with the IBM Watsonx.ai Python SDK, allowing you to track and analyze all your Watsonx.ai model interactions automatically.

Installation

pip install agentops ibm-watsonx-ai

Setting Up API Keys

Before using IBM Watsonx.ai with AgentOps, you need to set up your API keys. You can obtain:
  • IBM_WATSONX_API_KEY: from your IBM Cloud account
  • IBM_WATSONX_URL: the URL of your Watsonx.ai instance, found in your IBM Cloud dashboard
  • IBM_WATSONX_PROJECT_ID: the project ID of your Watsonx.ai project, found in the Watsonx.ai console
  • AGENTOPS_API_KEY: from your AgentOps Dashboard
To set them up, either export them as environment variables or define them in a .env file:
export IBM_WATSONX_API_KEY="your_ibm_api_key_here"
export IBM_WATSONX_URL="your_ibm_url_here"
export IBM_WATSONX_PROJECT_ID="your_project_id_here"
export AGENTOPS_API_KEY="your_agentops_api_key_here"
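Equivalently, the same keys can live in a .env file at your project root, which python-dotenv reads as KEY=value lines (the placeholder values below are illustrative):

```shell
# .env — loaded by python-dotenv; keep this file out of version control
IBM_WATSONX_API_KEY=your_ibm_api_key_here
IBM_WATSONX_URL=your_ibm_url_here
IBM_WATSONX_PROJECT_ID=your_project_id_here
AGENTOPS_API_KEY=your_agentops_api_key_here
```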
Then load the environment variables in your Python code:
from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

# Set up environment variables with fallback placeholder values
# (os.getenv returns None for unset variables, which os.environ rejects)
os.environ["IBM_WATSONX_API_KEY"] = os.getenv("IBM_WATSONX_API_KEY", "your_ibm_api_key_here")
os.environ["IBM_WATSONX_URL"] = os.getenv("IBM_WATSONX_URL", "your_ibm_url_here")
os.environ["IBM_WATSONX_PROJECT_ID"] = os.getenv("IBM_WATSONX_PROJECT_ID", "your_project_id_here")
os.environ["AGENTOPS_API_KEY"] = os.getenv("AGENTOPS_API_KEY", "your_agentops_api_key_here")
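Because a variable missing from both the shell and the .env file only surfaces later as a confusing authentication error, you may prefer to fail fast at startup. A minimal sketch of such a check (`require_env` is an illustrative helper, not part of AgentOps or the Watsonx SDK):

```python
import os

REQUIRED_VARS = [
    "IBM_WATSONX_API_KEY",
    "IBM_WATSONX_URL",
    "IBM_WATSONX_PROJECT_ID",
    "AGENTOPS_API_KEY",
]

def require_env(names):
    """Return a dict of the named environment variables, raising if any are unset or empty."""
    missing = [n for n in names if not os.getenv(n)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
    return {n: os.environ[n] for n in names}
```

Call `require_env(REQUIRED_VARS)` right after `load_dotenv()` so misconfiguration is reported before any API call is made.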

Usage

Initialize AgentOps at the beginning of your application to automatically track all IBM Watsonx.ai API calls:
import os

import agentops
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

# Initialize AgentOps
agentops.init(api_key=os.getenv("AGENTOPS_API_KEY"))

# Initialize credentials
credentials = Credentials(
    url=os.getenv("IBM_WATSONX_URL"),
    api_key=os.getenv("IBM_WATSONX_API_KEY"),
)

# Project ID
project_id = os.getenv("IBM_WATSONX_PROJECT_ID")

# Create a model instance
model = ModelInference(
    model_id="meta-llama/llama-3-3-70b-instruct",
    credentials=credentials,
    project_id=project_id
)

# Make a completion request
response = model.generate_text("What is artificial intelligence?")
print(f"Generated Text:\n{response}")

# Close the persistent connection when done
model.close_persistent_connection()

Examples

Watsonx Text Chat

Basic text generation and chat with Watsonx.ai.

Watsonx Streaming

Demonstrates streaming responses with Watsonx.ai.
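Streamed generations arrive as an iterator of text chunks rather than a single string. A minimal sketch of consuming such a stream, assuming a `model` configured as in the Usage section and the SDK's `generate_text_stream` method (the `collect_stream` helper is illustrative, not part of either library):

```python
def collect_stream(chunks):
    """Print streamed text chunks as they arrive and return the full response."""
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)
        parts.append(chunk)
    return "".join(parts)

# With a live model (credentials configured as in the Usage section):
# stream = model.generate_text_stream(prompt="What is artificial intelligence?")
# full_text = collect_stream(stream)
```

Printing with `end=""` and `flush=True` makes the output appear incrementally, which is the usual reason to stream in the first place.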

Watsonx Tokenization

Example of text tokenization with Watsonx.ai models.

Additional Resources