> Prefer zero setup? Use `plog run` or `import provenlog.auto` instead. See Auto-Instrumentation.

## Setup
## What gets captured
| Event | Action Type | Details |
|---|---|---|
| Message creation | LLM_CALL | Model, messages, parameters |
| Message response | LLM_RESPONSE | Content, token usage, stop reason |
| Streaming | LLM_CALL / LLM_RESPONSE | Same as above, captured on stream completion |
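The table above can be illustrated with the rough shape of the two record types. This is a hypothetical sketch: the field names (`action_type`, `params`, `usage`, and so on) and the model id are illustrative assumptions, not ProvenLog's actual schema.

```python
# Hypothetical record shapes -- field names are illustrative assumptions,
# not ProvenLog's real schema.
llm_call = {
    "action_type": "LLM_CALL",
    "model": "claude-sonnet-4-20250514",  # assumed model id
    "messages": [{"role": "user", "content": "Hello"}],
    "params": {"max_tokens": 256, "temperature": 0.0},
}

llm_response = {
    "action_type": "LLM_RESPONSE",
    "content": "Hi there!",
    "usage": {"input_tokens": 8, "output_tokens": 4},
    "stop_reason": "end_turn",
}
```

For streaming calls, the same pair of records would apply, with the `LLM_RESPONSE` fields filled in once the stream completes.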
## How it works
The Trail uses transparent method wrapping on the Anthropic client. It intercepts `messages.create()` and `messages.stream()` calls, captures their inputs and outputs, and logs them to the audit trail.
The wrapped client behaves identically to the original — ProvenLog never modifies the API call or response.
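The wrapping technique can be sketched in plain Python. This is not ProvenLog's implementation; `AuditTrail` and `wrap_method` are hypothetical names, and a dummy object stands in for the Anthropic client's `messages` attribute. The key property demonstrated is the one stated above: the wrapped method delegates to the original unchanged, only observing arguments and return values.

```python
import functools

class AuditTrail:
    """Minimal stand-in for ProvenLog's trail: a list of logged events."""
    def __init__(self):
        self.events = []

    def log(self, action_type, **details):
        self.events.append({"action_type": action_type, **details})

def wrap_method(obj, name, trail):
    """Replace obj.<name> with a wrapper that logs inputs and outputs,
    then delegates to the original method without modifying call or result."""
    original = getattr(obj, name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        trail.log("LLM_CALL", kwargs=kwargs)       # capture inputs
        result = original(*args, **kwargs)         # unmodified API call
        trail.log("LLM_RESPONSE", content=result)  # capture output
        return result

    setattr(obj, name, wrapper)

# Dummy stand-in for anthropic.Anthropic().messages
class DummyMessages:
    def create(self, **kwargs):
        return "ok"

trail = AuditTrail()
messages = DummyMessages()
wrap_method(messages, "create", trail)

assert messages.create(model="claude", max_tokens=10) == "ok"  # behavior unchanged
assert [e["action_type"] for e in trail.events] == ["LLM_CALL", "LLM_RESPONSE"]
```

Because the wrapper only observes the call, removing it (or never installing it) leaves client behavior byte-for-byte identical, which is what makes the instrumentation safe to enable in production.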