Prefer zero setup? Use `plog run` or import `provenlog.auto` instead. See Auto-Instrumentation.

Setup

```python
from anthropic import Anthropic
from provenlog.integrations.anthropic import Trail

with Trail(agent_id="my-anthropic-agent") as trail:
    client = trail.wrap(Anthropic())
    response = client.messages.create(
        model="claude-sonnet-4-5-20250929",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello!"}]
    )
```

What gets captured

| Event | Action Type | Details |
|---|---|---|
| Message creation | LLM_CALL | Model, messages, parameters |
| Message response | LLM_RESPONSE | Content, token usage, stop reason |
| Streaming | LLM_CALL / LLM_RESPONSE | Same as above, captured on stream completion |
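"Captured on stream completion" means the trail yields chunks through unchanged and only records the assembled response once the stream is exhausted. A minimal sketch of that pattern, in plain Python with a hypothetical `log` list standing in for the audit trail (this is illustrative, not ProvenLog's actual implementation):

```python
def logged_stream(chunks, log):
    """Yield each chunk unchanged; record the full response when the stream ends."""
    collected = []
    for chunk in chunks:
        collected.append(chunk)
        yield chunk  # caller sees chunks exactly as the API produced them
    # Only reached after the last chunk: log the completed response once.
    log.append({"action": "LLM_RESPONSE", "content": "".join(collected)})

log = []
output = list(logged_stream(iter(["Hel", "lo!"]), log))
# output contains the original chunks; log holds one assembled LLM_RESPONSE entry
```

Because logging happens after the generator is drained, a stream the caller abandons midway is never recorded as a complete response.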

How it works

The Trail uses transparent method wrapping on the Anthropic client. It intercepts messages.create() and messages.stream() calls, captures inputs and outputs, and logs them to the audit trail. The wrapped client behaves identically to the original — ProvenLog never modifies the API call or response.
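The wrapping technique itself is easy to see in isolation. The sketch below uses a stand-in `FakeClient` and a plain `log` list (both hypothetical, not ProvenLog internals) to show how a method can be intercepted for logging while leaving the call and its return value untouched:

```python
import functools

class FakeMessages:
    """Stand-in for an API client's messages resource."""
    def create(self, **kwargs):
        return {"content": "Hi!", "usage": {"output_tokens": 2}}

class FakeClient:
    def __init__(self):
        self.messages = FakeMessages()

def wrap(client, log):
    """Replace messages.create with a logging version that delegates unchanged."""
    original = client.messages.create

    @functools.wraps(original)
    def logged_create(**kwargs):
        log.append({"action": "LLM_CALL", "details": kwargs})       # capture inputs
        response = original(**kwargs)                               # call is not modified
        log.append({"action": "LLM_RESPONSE", "details": response}) # capture outputs
        return response                                             # response is not modified

    client.messages.create = logged_create
    return client

log = []
client = wrap(FakeClient(), log)
response = client.messages.create(model="demo",
                                  messages=[{"role": "user", "content": "Hello!"}])
# response is exactly what the unwrapped client would have returned;
# log now holds one LLM_CALL entry and one LLM_RESPONSE entry
```

The wrapper only observes arguments and return values, which is why wrapped and unwrapped clients behave identically from the caller's perspective.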

Configuration

```python
# Simple — uses default embedded mode
trail = Trail(agent_id="my-anthropic-agent")

# With explicit client for custom configuration
from provenlog import ProvenLogClient

client = ProvenLogClient("http://localhost:7600", agent_id="my-agent")
trail = Trail(client=client, agent_id="my-anthropic-agent")
```