Class: LLM::Tracer

Inherits:
Object
Defined in:
lib/llm/tracer.rb

Overview

LLM::Tracer is the superclass of all LLM tracers. It is useful for implementing instrumentation and hooking into the lifecycle of an LLM request. See LLM::Tracer::Telemetry and LLM::Tracer::Logger for example tracer implementations.
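
A minimal sketch of a custom tracer, assuming only the hook names and keyword arguments documented below. The `Tracer` stand-in class mirrors the documented base class so the sketch runs on its own; `TimingTracer` and all of its internals are illustrative, not part of the library.

```ruby
# Stand-in for LLM::Tracer so this sketch runs outside the library.
class Tracer
  def initialize(provider, options = {})
    @llm = provider
    @options = options
  end
end

# Hypothetical tracer that measures request latency.
class TimingTracer < Tracer
  # Return a span object; it is passed back to on_request_finish/on_request_error.
  def on_request_start(operation:, model: nil, inputs: nil)
    { operation: operation, model: model,
      started_at: Process.clock_gettime(Process::CLOCK_MONOTONIC) }
  end

  def on_request_finish(operation:, res:, model: nil, span: nil, outputs: nil, metadata: nil)
    elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - span[:started_at]
    format("%s took %.3fs", operation, elapsed)
  end

  def on_request_error(ex:, span:)
    "request failed: #{ex.class}"
  end
end

tracer = TimingTracer.new(:provider)
span = tracer.on_request_start(operation: "chat", model: "gpt-4o-mini")
summary = tracer.on_request_finish(operation: "chat", res: nil, span: span)
```

The span object returned by on_request_start is opaque to the caller; each tracer decides what to carry between start and finish.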

Direct Known Subclasses

Logger, Null, Telemetry

Defined Under Namespace

Classes: Langsmith, Logger, Null, Telemetry

Constant Summary collapse

FINISH_METADATA_PROC_KEY =
:"llm.tracer.finish_metadata_proc"

Instance Attribute Summary collapse

Instance Method Summary collapse

Constructor Details

#initialize(provider, options = {}) ⇒ Tracer

Returns a new instance of Tracer.

Parameters:

  • provider (LLM::Provider)

    A provider

  • options (Hash) (defaults to: {})

    A hash of options



# File 'lib/llm/tracer.rb', line 26

def initialize(provider, options = {})
  @llm = provider
  @options = options
end

Instance Attribute Details

#llm ⇒ LLM::Provider (readonly)

Returns:

  • (LLM::Provider)


# File 'lib/llm/tracer.rb', line 19

def llm
  @llm
end

Instance Method Details

#consume_extra_inputs ⇒ Hash

Returns and clears extra inputs for the next span. Called by the telemetry tracer when starting a span. Subclasses (e.g. Langsmith) override to return fiber-local inputs; default returns {}.

Returns:

  • (Hash)

    Attribute key => value to set on the span at start



# File 'lib/llm/tracer.rb', line 200

def consume_extra_inputs
  {}
end

#consume_extra_outputs ⇒ Hash

Returns and clears extra outputs for the current span. Called by the telemetry tracer when finishing a span. Subclasses override to return fiber-local outputs; default returns {}.

Returns:

  • (Hash)

    Attribute key => value to set on the span at finish



# File 'lib/llm/tracer.rb', line 210

def consume_extra_outputs
  {}
end

#consume_request_metadata ⇒ Hash

Consume and clear per-request metadata. Called by the telemetry tracer at span start.

Returns:

  • (Hash)


# File 'lib/llm/tracer.rb', line 232

def consume_request_metadata
  key = REQUEST_METADATA_KEY
  data = thread[key] || {}
  thread[key] = nil
  data
end

#current_extra ⇒ Hash

Returns the current extra bag (metadata, inputs, outputs) for the current thread/trace. Used by subclasses; default returns empty hashes.

Returns:

  • (Hash)

    { metadata: {}, inputs: {}, outputs: {} }



# File 'lib/llm/tracer.rb', line 190

def current_extra
  {}
end

#flush! ⇒ nil

Note:

This method is only implemented by the Telemetry tracer. It is a noop for other tracers.

Flush the tracer

Returns:

  • (nil)


# File 'lib/llm/tracer.rb', line 146

def flush!
  nil
end

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/tracer.rb', line 130

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} @provider=#{@llm.class} @tracer=#{@tracer.inspect}>"
end

#merge_extra(metadata: nil, inputs: nil, outputs: nil) ⇒ self

Merges extra attributes for the current trace/span. Used by applications (e.g. chatbot) to add metadata, span inputs, or span outputs to the next span or to the trace. No-op by default; Langsmith merges into fiber-local storage and emits them as langsmith/GenAI attributes.

Parameters:

  • metadata (Hash, nil) (defaults to: nil)

    Key-value pairs merged into trace/span metadata (e.g. langsmith.metadata.*).

  • inputs (Hash, nil) (defaults to: nil)

    Key-value pairs set on the next span at start (e.g. gen_ai.input.messages). Consumed when the span is created.

  • outputs (Hash, nil) (defaults to: nil)

    Key-value pairs set on the current span at finish (e.g. gen_ai.output.messages). Must be set before the request finishes (e.g. in a block passed to the provider).

Returns:

  • (self)


# File 'lib/llm/tracer.rb', line 165

def merge_extra(metadata: nil, inputs: nil, outputs: nil)
  self
end
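
The merge/consume handoff described above can be sketched with thread-local storage. This is a hypothetical illustration: the `ExtraBagTracer` class, the `EXTRA_KEY` name, and the storage layout are assumptions, not the Langsmith tracer's actual implementation.

```ruby
# Hypothetical sketch of the merge_extra / consume_extra_inputs pairing.
class ExtraBagTracer
  EXTRA_KEY = :"llm.tracer.extra" # illustrative key name

  # Queue extras: inputs attach to the *next* span at start,
  # outputs to the *current* span at finish.
  def merge_extra(metadata: nil, inputs: nil, outputs: nil)
    bag = Thread.current[EXTRA_KEY] ||= { metadata: {}, inputs: {}, outputs: {} }
    bag[:metadata].merge!(metadata) if metadata
    bag[:inputs].merge!(inputs)     if inputs
    bag[:outputs].merge!(outputs)   if outputs
    self
  end

  # Return and clear the queued inputs, as documented for consume_extra_inputs.
  def consume_extra_inputs
    bag = Thread.current[EXTRA_KEY]
    return {} unless bag
    inputs = bag[:inputs]
    bag[:inputs] = {}
    inputs
  end
end

tracer = ExtraBagTracer.new
tracer.merge_extra(inputs: { "gen_ai.input.messages" => "[...]" })
first  = tracer.consume_extra_inputs # queued inputs, returned once
second = tracer.consume_extra_inputs # already consumed
```

Consuming clears the bag, which is why inputs merged after a span starts apply only to the following span.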

#on_request_error(ex:, span:) ⇒ void

This method returns an undefined value.

Called when an LLM provider request fails.

Parameters:

  • ex (Exception)

    The raised error.

  • span (Object, nil)

    The span/context object returned by #on_request_start.

Raises:

  • (NotImplementedError)


# File 'lib/llm/tracer.rb', line 59

def on_request_error(ex:, span:)
  raise NotImplementedError, "#{self.class} does not implement '#{__method__}'"
end

#on_request_finish(operation:, res:, model: nil, span: nil, outputs: nil, metadata: nil) ⇒ void

This method returns an undefined value.

Called after an LLM provider request succeeds.

Parameters:

  • operation (String)
  • res (LLM::Response)
  • span (Object, nil) (defaults to: nil)
  • model (String) (defaults to: nil)
  • outputs (Hash, nil) (defaults to: nil)

    Optional span attributes (e.g. gen_ai.output.messages) from llm.rb or caller.

  • metadata (Hash, nil) (defaults to: nil)

    Optional metadata (emitted as langsmith.metadata.*) from llm.rb or caller.

Raises:

  • (NotImplementedError)


# File 'lib/llm/tracer.rb', line 50

def on_request_finish(operation:, res:, model: nil, span: nil, outputs: nil, metadata: nil)
  raise NotImplementedError, "#{self.class} does not implement '#{__method__}'"
end

#on_request_start(operation:, model: nil, inputs: nil) ⇒ void

This method returns an undefined value.

Called before an LLM provider request is executed.

Parameters:

  • operation (String)
  • model (String) (defaults to: nil)
  • inputs (Hash, nil) (defaults to: nil)

    Optional span attributes (e.g. gen_ai.input.messages) from llm.rb or caller.

Raises:

  • (NotImplementedError)


# File 'lib/llm/tracer.rb', line 37

def on_request_start(operation:, model: nil, inputs: nil)
  raise NotImplementedError, "#{self.class} does not implement '#{__method__}'"
end

#on_tool_error(ex:, span:) ⇒ void

This method returns an undefined value.

Called when a local tool/function raises.

Parameters:

  • ex (Exception)

    The raised error.

  • span (Object, nil)

    The span/context object returned by #on_tool_start.

Raises:

  • (NotImplementedError)


# File 'lib/llm/tracer.rb', line 96

def on_tool_error(ex:, span:)
  raise NotImplementedError, "#{self.class} does not implement '#{__method__}'"
end

#on_tool_finish(result:, span:) ⇒ void

This method returns an undefined value.

Called after a local tool/function succeeds.

Parameters:

  • result (Object)

    The tool's return value.

  • span (Object, nil)

    The span/context object returned by #on_tool_start.

Raises:

  • (NotImplementedError)


# File 'lib/llm/tracer.rb', line 85

def on_tool_finish(result:, span:)
  raise NotImplementedError, "#{self.class} does not implement '#{__method__}'"
end

#on_tool_start(id:, name:, arguments:, model:) ⇒ void

This method returns an undefined value.

Called before a local tool/function executes.

Parameters:

  • id (String)

    The tool call ID assigned by the model/provider

  • name (String)

    The tool (function) name.

  • arguments (Hash)

    The parsed tool arguments.

  • model (String)

    The model name

Raises:

  • (NotImplementedError)


# File 'lib/llm/tracer.rb', line 74

def on_tool_start(id:, name:, arguments:, model:)
  raise NotImplementedError, "#{self.class} does not implement '#{__method__}'"
end

#set_finish_metadata_proc(proc) ⇒ self

Optional: set a proc to supply metadata when the next chat span finishes. The proc is called with the response (res) and should return a Hash of metadata (e.g. { intent: "…", confidence: 1.0 }) to merge onto the span as langsmith.metadata.*. Cleared after use. Used by apps to attach routing/intent that is only known after the response.

Parameters:

  • proc (Proc, nil)

    (res) -> Hash or nil

Returns:

  • (self)


# File 'lib/llm/tracer.rb', line 178

def set_finish_metadata_proc(proc)
  thread[FINISH_METADATA_PROC_KEY] = proc
  self
end
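
The set/consume lifecycle described above can be sketched as follows. Only the thread-local set is shown in the source; the `consume_finish_metadata` helper here is an illustration of what a tracer might do at span finish, not the library's actual code.

```ruby
FINISH_METADATA_PROC_KEY = :"llm.tracer.finish_metadata_proc"

# Store a proc to be called with the response when the next span finishes.
def set_finish_metadata_proc(proc)
  Thread.current[FINISH_METADATA_PROC_KEY] = proc
end

# Hypothetical finish-side counterpart: call the proc with the response,
# then clear it so it applies to one span only.
def consume_finish_metadata(res)
  fn = Thread.current[FINISH_METADATA_PROC_KEY]
  Thread.current[FINISH_METADATA_PROC_KEY] = nil
  fn ? (fn.call(res) || {}) : {}
end

set_finish_metadata_proc(->(res) { { intent: "greeting", confidence: 1.0 } })
first  = consume_finish_metadata(:response) # proc runs once
second = consume_finish_metadata(:response) # proc already cleared
```

Because the proc runs at finish, it can compute metadata (such as intent classification) from the response itself.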

#set_request_metadata(metadata) ⇒ nil

Store per-request metadata (e.g. user_input) to be consumed by tracers when starting the next span. Used for plain-text input.value / output.value.

Parameters:

  • metadata (Hash)

e.g. { user_input: "the user question" }

Returns:

  • (nil)


# File 'lib/llm/tracer.rb', line 220

def set_request_metadata(metadata)
  return nil unless metadata && !metadata.empty?
  key = REQUEST_METADATA_KEY
  current = thread[key] || {}
  thread[key] = current.merge(metadata.compact)
  nil
end

#spans ⇒ Array

Returns:

  • (Array)


# File 'lib/llm/tracer.rb', line 136

def spans
  []
end

#start_trace(trace_group_id: nil, name: "llm", attributes: {}, metadata: nil) ⇒ self

Opens a trace group so subsequent LLM spans share the same OpenTelemetry trace_id (and appear as one trace in backends like Langfuse). When trace_group_id is a string, it is used to derive the trace_id.

Parameters:

  • trace_group_id (String, nil) (defaults to: nil)

    Optional. When present, converted to a 16-byte trace_id so all spans created until #stop_trace are grouped in one trace.

  • name (String) (defaults to: "llm")

Name for the root span (e.g. "chatbot.turn").

  • attributes (Hash) (defaults to: {})

    OpenTelemetry attributes to set on the root span.

  • metadata (Hash, nil) (defaults to: nil)

    Optional. Trace-level metadata merged into the trace (e.g. langsmith.metadata.*). Only used by tracers that support it (e.g. Langsmith).

Returns:

  • (self)


# File 'lib/llm/tracer.rb', line 116

def start_trace(trace_group_id: nil, name: "llm", attributes: {}, metadata: nil)
  self
end

#stop_trace ⇒ self

Finishes the trace group started by #start_trace. Safe to call even if no trace is active.

Returns:

  • (self)


# File 'lib/llm/tracer.rb', line 124

def stop_trace
  self
end
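
A usage sketch for grouping one logical turn into a single trace, assuming only the signatures above. `NullishTracer` is a stand-in mirroring the no-op base-class behavior (both methods return self); the session id and metadata are illustrative.

```ruby
# Stand-in with the documented no-op base-class behavior.
class NullishTracer
  def start_trace(trace_group_id: nil, name: "llm", attributes: {}, metadata: nil)
    self
  end

  def stop_trace
    self
  end
end

tracer = NullishTracer.new
session_id = "session-1234" # illustrative trace group id

tracer.start_trace(trace_group_id: session_id, name: "chatbot.turn",
                   metadata: { channel: "web" })
begin
  # one or more LLM requests happen here; their spans share one trace_id
ensure
  tracer.stop_trace # safe even if no trace is active
end
```

Pairing start_trace with stop_trace in an ensure block keeps the trace group closed even when a request in between raises.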