

Spans

A span is one step inside a run. Use withSpan to wrap any block of code; the dashboard shows the span in the run’s waterfall.

Kinds

Kind        For                                             Required fields
llm         Model calls                                     model, inputTokens, outputTokens (auto for supported providers)
tool        Functions the agent invokes (search, DB, API)   toolName
retrieval   Vector search, KB lookup, RAG retriever         (none)
agent       A nested sub-step rendered as an agent stage    (none)
chain       A chain / graph step (LangChain, LangGraph)     (none)
function    Generic instrumented function                   (none)
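The non-llm kinds follow the same withSpan shape. Here is a retrieval-span sketch in the style of the examples below; the in-memory stand-in for trodo.withSpan and the hard-coded document list exist only so the snippet runs on its own, and are not part of the real SDK.

```javascript
// Minimal stand-in for trodo.withSpan so this sketch is self-contained;
// the real SDK reports spans to the Trodo backend instead of an array.
const spans = [];
const trodo = {
  async withSpan(meta, fn) {
    const span = {
      ...meta,
      status: 'ok',
      setInput(v) { span.input = v; },
      setOutput(v) { span.output = v; },
    };
    spans.push(span);
    return fn(span);
  },
};

// A retrieval span wrapping a (hypothetical) vector-store lookup:
async function retrieve(query) {
  return trodo.withSpan({ kind: 'retrieval', name: 'vector_search' }, async (span) => {
    span.setInput({ query });
    const docs = [{ id: 'doc-1' }, { id: 'doc-2' }]; // stand-in for a real search
    span.setOutput({ count: docs.length });
    return docs;
  });
}

retrieve('refund policy').then((docs) => {
  console.log(spans[0].kind, spans[0].output.count, docs.length); // retrieval 2 2
});
```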

Example — all fields

await trodo.withSpan({ kind: 'llm', name: 'chat.completions' }, async (span) => {
  // AUTO-populated when using OpenAI / Anthropic / Bedrock / Cohere / Gemini /
  // Vertex / Mistral / LangChain / LlamaIndex / Vercel AI SDK.
  // Set these manually only when calling the provider via raw fetch.
  span.setLlm({
    model: 'gpt-4o-mini',          // auto
    provider: 'openai',             // auto
    inputTokens: 120,               // auto (from response.usage)
    outputTokens: 64,               // auto
    cost: 0.00034,                  // auto (computed from tokens + model)
    temperature: 0.2,               // auto (from request body)
  });

  // Always manual:
  span.setInput({ prompt: '...' });
  span.setOutput({ completion: '...' });
  span.setAttribute('customer_tier', 'enterprise');

  try {
    return await openai.chat.completions.create({ ... });
  } catch (err) {
    span.setError({ type: 'RateLimitError', message: err.message }); // status = 'error'
    throw err;
  }
});

// Tool span — toolName is required:
await trodo.withSpan({ kind: 'tool', name: 'search_kb', toolName: 'search_kb' }, async (span) => {
  span.setInput({ query });
  const results = await kb.search(query);
  span.setOutput({ count: results.length });
  return results;
});

Field reference

Setter                        DB column                                   Auto-captured for supported frameworks?
setInput(value)               input                                       no — always manual
setOutput(value)              output                                      no — always manual
setLlm({ model })             model                                       yes
setLlm({ provider })          provider                                    yes
setLlm({ inputTokens })       input_tokens                                yes
setLlm({ outputTokens })      output_tokens                               yes
setLlm({ cost })              cost                                        yes (computed from tokens + model)
setLlm({ temperature })       temperature                                 yes
setTool(name)                 tool_name                                   yes (LangChain, LlamaIndex tool calls)
setError({ type, message })   error_type, error_message, status='error'   yes (on thrown exceptions)
setAttribute(key, value)      attributes.<key>                            no — always manual
span.name                     name                                        yes (from the instrumented method name)
span.kind                     kind                                        yes
span.status                   status                                      yes (ok on success, error on throw)
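The attributes.<key> row means each setAttribute call lands under its own key in the attributes column. A minimal sketch of that mapping, using a hand-rolled stand-in span rather than the real SDK object:

```javascript
// Stand-in span showing how setAttribute collects key/value pairs
// under the attributes.<key> shape from the table above:
function makeSpan() {
  const span = {
    attributes: {},
    setAttribute(key, value) { span.attributes[key] = value; },
  };
  return span;
}

const span = makeSpan();
span.setAttribute('customer_tier', 'enterprise');
span.setAttribute('region', 'eu-west-1');
console.log(span.attributes); // { customer_tier: 'enterprise', region: 'eu-west-1' }
```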

Errors

A thrown exception inside withSpan sets status='error' and fills error_type and error_message from the exception, then re-throws. Wrap the call in try/catch if you want to recover; the span is recorded either way.
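The semantics above can be sketched with a minimal stand-in for withSpan; this is an illustration of the documented behavior (record the span, mark it errored, re-throw), not the SDK's actual implementation.

```javascript
// Self-contained sketch of the error semantics: the span is always
// recorded, marked status='error' on a throw, and the exception is
// re-thrown so callers still see the failure.
const recorded = [];
async function withSpan(meta, fn) {
  const span = {
    ...meta,
    status: 'ok',
    setError(e) {
      span.status = 'error';
      span.errorType = e.type;
      span.errorMessage = e.message;
    },
  };
  try {
    return await fn(span);
  } catch (err) {
    if (span.status !== 'error') {
      span.setError({ type: err.constructor.name, message: err.message });
    }
    throw err;             // re-thrown: callers must still handle it
  } finally {
    recorded.push(span);   // the span is recorded either way
  }
}

withSpan({ kind: 'llm', name: 'flaky_call' }, async () => {
  throw new Error('429 Too Many Requests');
}).catch(() => {
  console.log(recorded[0].status, recorded[0].errorMessage);
  // error 429 Too Many Requests
});
```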

Next

  • Spans outside wrapAgent — emit spans from files / workers / services that run outside the callback.
  • Patterns — helpers (tool / llm / trace), trackLlmCall, feedback, custom attributes.