Why Track Memori with AgentOps?
- Memory Recording: Track when conversations are automatically captured and stored
- Context Injection: Monitor how memory is automatically added to LLM context
- Conversation Flow: Understand the complete dialogue history across sessions
- Memory Effectiveness: Analyze how historical context improves response quality
- Performance Impact: Track latency and token usage from memory operations
- Error Tracking: Identify issues with memory recording or context retrieval
Installation
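Install the AgentOps SDK and Memori from PyPI, along with your LLM provider's client library (for example, openai) and python-dotenv for loading environment variables; exact Memori package naming may vary by distribution, so check Memori's own install instructions.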
Environment Configuration
Load environment variables and set up API keys.
Tracking Automatic Memory Operations
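The sketch below is a minimal end-to-end example: it loads API keys from a local .env file (the environment configuration step above), starts AgentOps instrumentation, and enables Memori so conversation recording and context injection happen automatically around each LLM call. The Memori class, its enable() method, and the database_connect / conscious_ingest settings follow Memori's documented usage but are assumptions that may differ in your version; the model name, database path, and prompts are placeholders.

```python
# Assumes agentops, memori, openai, and python-dotenv are installed, and that
# AGENTOPS_API_KEY and OPENAI_API_KEY are defined in a local .env file.
import os

import agentops
from dotenv import load_dotenv
from memori import Memori  # assumed import path; see Memori's docs
from openai import OpenAI

# Environment configuration: load API keys from .env into the process environment.
load_dotenv()

# Start AgentOps instrumentation; LLM calls below will appear as traces in the dashboard.
agentops.init(api_key=os.getenv("AGENTOPS_API_KEY"))

# Enable Memori's automatic conversation recording and context injection.
# database_connect and conscious_ingest are illustrative settings from Memori's examples.
memory = Memori(
    database_connect="sqlite:///memori_demo.db",
    conscious_ingest=True,
)
memory.enable()

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# First call: Memori records the exchange; AgentOps traces the LLM call.
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "I'm building a FastAPI service for invoices."}],
)

# Later call: Memori injects relevant history into the context automatically, and the
# added tokens and latency from that injection show up in the AgentOps dashboard.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What framework did I say I was using?"}],
)
print(response.choices[0].message.content)
```

Run the script twice or add more turns to see conversation history accumulate; each memory recording and context injection is captured alongside the corresponding LLM span in AgentOps.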
What You’ll See in AgentOps
When using Memori with AgentOps, your dashboard will show:
- Conversation Timeline: Complete flow of all conversations with memory context
- Memory Injection Analytics: Track when and how much context is automatically added
- Context Relevance: Monitor the effectiveness of automatic memory retrieval
- Performance Metrics: Latency impact of memory operations on LLM calls
- Token Usage: Track additional tokens consumed by memory context
- Memory Growth: Visualize how conversation history accumulates over time
- Error Tracking: Failed memory operations with full error context
Key Benefits of Memori + AgentOps
- Zero-Effort Memory: Memori automatically handles conversation recording
- Intelligent Context: Only relevant memory is injected into LLM context
- Complete Visibility: AgentOps tracks all automatic memory operations
- Performance Monitoring: Understand the cost/benefit of automatic memory
- Debugging Support: Full traceability of memory decisions and context injection