What auto-instruments
Install @opentelemetry/instrumentation-anthropic (Node) or opentelemetry-instrumentation-anthropic (Python) and every Claude call inside wrapAgent becomes a span.
| Call | Span kind | Auto-extracted |
|---|---|---|
| client.messages.create | llm | model, input/output tokens, stop_reason, messages, completion |
| client.messages.stream | llm | Same, accumulated across deltas |
| client.messages.count_tokens | llm | model, input tokens |
| Tool use (server-side) | llm | Tool definitions in prompt; actual tool runs are your code |
Install
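The likely install commands, assuming the package names from the "What auto-instruments" section above and the trodo-node import shown in the gotchas (the Python tracer package name is an assumption):

```shell
# Node: tracer SDK plus the Anthropic auto-instrumentation
npm install trodo-node @opentelemetry/instrumentation-anthropic

# Python: tracer SDK (package name assumed) plus the instrumentation
pip install trodo opentelemetry-instrumentation-anthropic
```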
Minimum example
- Node.js
- Python
Tool-use loop
The classic Anthropic tool-use pattern: call, inspect stop_reason, run tools, append results, call again. Each messages.create is its own llm span. Wrap your tool executions so they appear as tool spans between the LLM spans. The resulting trace alternates llm and tool spans, with input/output visible at every step.
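The loop itself can be sketched like this. Note the stand-ins: client.create mimics client.messages.create, and runTool is a hypothetical local tool executor; the message shapes are simplified, not the SDK's exact types.

```javascript
// Hypothetical local tool executor. In real code, wrap each call in a
// tool span via withSpan({ kind: 'tool' }) with setTool(name).
function runTool(name, input) {
  if (name === 'get_weather') return { temperature: 21, unit: 'C', city: input.city };
  throw new Error(`unknown tool: ${name}`);
}

// The tool-use loop: call, inspect stop_reason, run tools, append
// tool_result blocks, call again. Each client.create is one llm span.
function toolUseLoop(client, messages) {
  for (;;) {
    const response = client.create({ messages });
    messages.push({ role: 'assistant', content: response.content });
    if (response.stop_reason !== 'tool_use') return response; // done
    const results = response.content
      .filter((block) => block.type === 'tool_use')
      .map((block) => ({
        type: 'tool_result',
        tool_use_id: block.id,
        content: JSON.stringify(runTool(block.name, block.input)),
      }));
    messages.push({ role: 'user', content: results });
  }
}
```

Because the tool runs happen between the two client.create calls, the trace shows the tool spans sandwiched between the llm spans automatically.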
Streaming
messages.stream accumulates input_tokens from message_start and output_tokens from message_delta events. The span closes when the stream iterator finishes.
If you consume the stream and throw partway, the span is still recorded (as error) with tokens seen so far — useful for diagnosing timeouts.
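A minimal reconstruction of that accumulation, with event shapes simplified from the Anthropic streaming protocol (output_tokens in message_delta events is a running cumulative count, so the last value seen wins):

```javascript
// Sketch of how the span's token counts are derived from stream events.
function accumulateUsage(events) {
  const usage = { input_tokens: 0, output_tokens: 0 };
  for (const event of events) {
    if (event.type === 'message_start') {
      usage.input_tokens = event.message.usage.input_tokens; // known up front
    } else if (event.type === 'message_delta' && event.usage) {
      usage.output_tokens = event.usage.output_tokens; // cumulative, so assign
    }
  }
  return usage;
}
```

This is also why an aborted stream still yields useful numbers: whatever deltas were consumed before the throw have already updated the counters.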
Auto vs manual cheat-table
| Operation | Auto? | If manual, use |
|---|---|---|
| messages.create | yes | — |
| messages.stream | yes | — |
| messages.count_tokens | yes | — |
| messages.batches.create | no | withSpan({ kind: 'generic' }) on submit + a second span when you poll results |
| Tool executions | no (your code) | withSpan({ kind: 'tool' }) with setTool(name) |
| Prompt caching (cache_control) | partial (tokens include cache_creation + cache_read) | Read span.attributes['gen_ai.usage.cache_read_input_tokens'] in the detail drawer |
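For the prompt-caching row, pulling the counters off a finished span might look like the sketch below. The cache_read attribute key comes from the table; the cache_creation key and the span object shape are assumptions for illustration.

```javascript
// Hypothetical helper: read cache token counts from a finished span's
// attributes (keys follow the gen_ai.* convention used in the table).
function cacheStats(span) {
  const attrs = span.attributes || {};
  return {
    cacheRead: attrs['gen_ai.usage.cache_read_input_tokens'] ?? 0,
    cacheCreation: attrs['gen_ai.usage.cache_creation_input_tokens'] ?? 0,
  };
}
```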
Gotchas
- Import order matters: call trodo.init before import Anthropic from '@anthropic-ai/sdk', or the instrumentation may miss the prototype patch. The safest pattern is a top-of-file import trodo from 'trodo-node'; trodo.init(...).
- Anthropic errors (rate limit, bad model) surface as status='error' on the span with error_type='APIError' and the full message. The surrounding run's error_count rolls up.