Groq accelerates LLM inference with its ultra-fast Language Processing Unit (LPU) that powers Groq Cloud. Explore the Groq docs to get started with their developer console.

Steps to Integrate Groq with AgentOps

1. Install the AgentOps SDK

pip install agentops
2. Install the Groq SDK

pip install groq
3. Initialize AgentOps and develop with Groq

Make sure to call agentops.init before initializing the Groq client or making any model calls.

from groq import Groq
import agentops

agentops.init(<INSERT YOUR API KEY HERE>)
client = Groq(api_key="your_api_key")

# Your code here...

agentops.end_session('Success')

Set your API keys as environment variables in a .env file for easy access.

AGENTOPS_API_KEY=<YOUR API KEY>
GROQ_API_KEY=<YOUR GROQ API KEY>

Read more about environment variables in Advanced Configuration
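Once the variables are set, the keys can be read in code instead of being hard-coded. A minimal sketch, using only the standard library (in practice you might load the .env file with a tool such as python-dotenv or export the variables in your shell); the require_env helper and the stand-in values are illustrative, not part of either SDK:

```python
# Sketch: read API keys from the environment and fail fast if one is missing.
# The stand-in values below exist only so the snippet runs on its own;
# in real use the variables come from your shell or .env file.
import os

os.environ.setdefault("AGENTOPS_API_KEY", "demo-agentops-key")  # stand-in value
os.environ.setdefault("GROQ_API_KEY", "demo-groq-key")          # stand-in value

def require_env(name: str) -> str:
    """Return the named environment variable or raise a clear error."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

agentops_key = require_env("AGENTOPS_API_KEY")
groq_key = require_env("GROQ_API_KEY")
```

These values can then be passed to agentops.init(agentops_key) and Groq(api_key=groq_key), so no secrets are committed to source control.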

4. Run your Agent

Execute your program and visit app.agentops.ai/drilldown to observe your Agent! šŸ•µļø

After your run, AgentOps prints a clickable URL in the console that links directly to your session in the Dashboard.


Full Examples

from groq import Groq
import agentops

agentops.init(<INSERT YOUR API KEY HERE>)
client = Groq(api_key="your_api_key")

response = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="llama3-8b-8192",
)

print(response.choices[0].message.content)
agentops.end_session('Success')

Multi Agent Example

This notebook demonstrates how to use AgentOps with a multi-agent system using Groq.