Class: LLM::Agent
- Inherits: Object
- Defined in: lib/llm/agent.rb
Overview
LLM::Agent provides a class-level DSL for defining reusable, preconfigured assistants with defaults for model, tools, schema, and instructions.
It wraps the same stateful runtime surface as LLM::Context: message history, usage, persistence, streaming parameters, and provider-backed requests still flow through an underlying context. The defining behavior of an agent is that it automatically resolves pending tool calls for you during `talk` and `respond`, instead of leaving tool loops to the caller.
Notes:
-
Instructions are injected once unless a system message is already present.
-
An agent automatically executes tool loops (unlike LLM::Context).
-
The automatic tool loop enables the wrapped context's `guard` by default. The built-in LLM::LoopGuard detects repeated tool-call patterns and blocks stuck execution before more tool work is queued.
-
The default tool attempt budget is `25`. After that, the agent sends advisory tool errors back through the model and keeps the loop in-band. Set `tool_attempts: nil` to disable that advisory behavior.
-
Tool loop execution can be configured with `concurrency :call`, `:thread`, `:task`, `:fiber`, `:ractor`, or a list of queued task types such as `[:thread, :ractor]`.
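The class-level DSL is meant to read like declarative configuration in a subclass. The following self-contained sketch uses a toy stand-in for illustration only (ToyAgent, SupportAgent, and the tool names are hypothetical, not part of the real LLM::Agent API); it reproduces the set-or-get pattern the class methods below use:

```ruby
# Toy stand-in for the class-level DSL. Each class method acts as a
# setter when given an argument and as a getter otherwise, so a
# subclass body reads like configuration.
class ToyAgent
  def self.model(model = nil)
    return @model if model.nil?
    @model = model
  end

  def self.instructions(instructions = nil)
    return @instructions if instructions.nil?
    @instructions = instructions
  end

  def self.tools(*tools)
    return @tools || [] if tools.empty?
    @tools = tools.flatten
  end
end

class SupportAgent < ToyAgent
  model "gpt-4o-mini"
  instructions "You are a concise support assistant."
  tools :search, :calculator
end

SupportAgent.model  # => "gpt-4o-mini"
SupportAgent.tools  # => [:search, :calculator]
```

Because the values live in class-level instance variables, each subclass carries its own configuration: ToyAgent.tools is still empty after SupportAgent sets its tools.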
Instance Attribute Summary collapse
-
#llm ⇒ LLM::Provider
readonly
Returns a provider.
Class Method Summary collapse
-
.concurrency(concurrency = nil) ⇒ Symbol, ...
Set or get the tool execution concurrency.
-
.instructions(instructions = nil) ⇒ String?
Set or get the default instructions.
-
.model(model = nil) ⇒ String?
Set or get the default model.
-
.schema(schema = nil) ⇒ #to_json?
Set or get the default schema.
-
.skills(*skills) ⇒ Array<String>?
Set or get the default skills.
-
.tools(*tools) ⇒ Array<LLM::Function>
Set or get the default tools.
-
.tracer(tracer = nil, &block) ⇒ LLM::Tracer, ...
Set or get the default tracer.
Instance Method Summary collapse
- #call ⇒ Object
-
#concurrency ⇒ Symbol, ...
Returns the configured tool execution concurrency.
- #context_window ⇒ Integer
- #cost ⇒ LLM::Cost
- #deserialize(**kw) ⇒ Object (also: #restore)
- #functions ⇒ Array<LLM::Function>
-
#image_url(url) ⇒ LLM::Object
Returns a tagged object.
-
#initialize(llm, params = {}) ⇒ Agent
constructor
A new instance of Agent.
- #inspect ⇒ String
-
#interrupt! ⇒ nil
(also: #cancel!)
Interrupt the active request, if any.
-
#local_file(path) ⇒ LLM::Object
Returns a tagged object.
- #messages ⇒ LLM::Buffer<LLM::Message>
- #mode ⇒ Symbol
-
#model ⇒ String
Returns the model an Agent is actively using.
- #prompt(&b) ⇒ LLM::Prompt (also: #build_prompt)
-
#remote_file(res) ⇒ LLM::Object
Returns a tagged object.
-
#respond(prompt, params = {}) ⇒ LLM::Response
Maintain a conversation via the responses API.
- #returns ⇒ Array<LLM::Function::Return>
- #serialize(**kw) ⇒ void (also: #save)
-
#talk(prompt, params = {}) ⇒ LLM::Response
(also: #chat)
Maintain a conversation via the chat completions API.
- #to_h ⇒ Hash
- #to_json ⇒ String
-
#tracer ⇒ LLM::Tracer
Returns an LLM tracer.
- #usage ⇒ LLM::Object
- #wait ⇒ Array<LLM::Function::Return>
Constructor Details
#initialize(llm, params = {}) ⇒ Agent
Returns a new instance of Agent.
# File 'lib/llm/agent.rb', line 155

def initialize(llm, params = {})
  defaults = {model: self.class.model, tools: self.class.tools,
              skills: self.class.skills, schema: self.class.schema}.compact
  @concurrency = params.delete(:concurrency) || self.class.concurrency
  @llm = llm
  tracer = params.key?(:tracer) ? params.delete(:tracer) : self.class.tracer
  @tracer = resolve_option(tracer) unless tracer.nil?
  @ctx = LLM::Context.new(llm, defaults.merge({guard: true}).merge(params))
end
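The constructor assembles the effective configuration with layered Hash#merge calls, and later merges win: per-instance params override both the class-level defaults and the built-in guard: true. A quick illustration of that precedence (the hash values are illustrative, not real defaults):

```ruby
# Later merges win: user-supplied params override class defaults,
# including the guard default.
defaults  = {model: "default-model", tools: []}
effective = defaults.merge({guard: true}).merge({model: "custom", guard: false})
effective  # => {:model=>"custom", :tools=>[], :guard=>false}
```

This is why passing guard: false at construction time disables the loop guard even though agents enable it by default.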
Instance Attribute Details
#llm ⇒ LLM::Provider (readonly)
Returns a provider.

# File 'lib/llm/agent.rb', line 44

def llm
  @llm
end
Class Method Details
.concurrency(concurrency = nil) ⇒ Symbol, ...
Set or get the tool execution concurrency.
# File 'lib/llm/agent.rb', line 117

def self.concurrency(concurrency = nil)
  return @concurrency if concurrency.nil?
  @concurrency = concurrency
end
.instructions(instructions = nil) ⇒ String?
Set or get the default instructions.

# File 'lib/llm/agent.rb', line 96

def self.instructions(instructions = nil)
  return @instructions if instructions.nil?
  @instructions = instructions
end
.model(model = nil) ⇒ String?
Set or get the default model.

# File 'lib/llm/agent.rb', line 52

def self.model(model = nil)
  return @model if model.nil?
  @model = model
end
.schema(schema = nil) ⇒ #to_json?
Set or get the default schema.

# File 'lib/llm/agent.rb', line 85

def self.schema(schema = nil)
  return @schema if schema.nil?
  @schema = schema
end
.skills(*skills) ⇒ Array<String>?
Set or get the default skills.

# File 'lib/llm/agent.rb', line 74

def self.skills(*skills)
  return @skills if skills.empty?
  @skills = skills.flatten
end
.tools(*tools) ⇒ Array<LLM::Function>
Set or get the default tools.

# File 'lib/llm/agent.rb', line 63

def self.tools(*tools)
  return @tools || [] if tools.empty?
  @tools = tools.flatten
end
.tracer(tracer = nil, &block) ⇒ LLM::Tracer, ...
Set or get the default tracer.
When a block is provided, it is stored and evaluated lazily against the agent instance during initialization so it can build a tracer from the resolved provider.
# File 'lib/llm/agent.rb', line 137

def self.tracer(tracer = nil, &block)
  return @tracer if tracer.nil? && !block
  @tracer = block || tracer
end
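When .tracer receives a block, the block itself is stored and only resolved later, during initialization, against the concrete instance. A self-contained sketch of that store-now, resolve-later pattern (Config and the string "tracer" are illustrative stand-ins, not the real API):

```ruby
# Store-now, resolve-later: the block is kept as a Proc and only
# called when the owner resolves it with a concrete argument.
class Config
  def self.tracer(tracer = nil, &block)
    return @tracer if tracer.nil? && !block
    @tracer = block || tracer
  end
end

Config.tracer { |provider| "tracer-for-#{provider}" }
stored = Config.tracer    # a Proc, not yet evaluated
stored.call("openai")     # => "tracer-for-openai"
```

Deferring evaluation lets the block build a tracer from state (such as the resolved provider) that does not exist when the class body runs.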
Instance Method Details
#call ⇒ Object
# File 'lib/llm/agent.rb', line 228

def call(...)
  @tracer ? @llm.with_tracer(@tracer) { @ctx.call(...) } : @ctx.call(...)
end
#concurrency ⇒ Symbol, ...
Returns the configured tool execution concurrency.
# File 'lib/llm/agent.rb', line 312

def concurrency
  @concurrency
end
#context_window ⇒ Integer
# File 'lib/llm/agent.rb', line 326

def context_window
  @ctx.context_window
end
#deserialize(**kw) ⇒ Object Also known as: restore
# File 'lib/llm/agent.rb', line 361

def deserialize(**kw)
  @ctx.deserialize(**kw)
end
#functions ⇒ Array<LLM::Function>
# File 'lib/llm/agent.rb', line 214

def functions
  @tracer ? @llm.with_tracer(@tracer) { @ctx.functions } : @ctx.functions
end
#image_url(url) ⇒ LLM::Object
Returns a tagged object.

# File 'lib/llm/agent.rb', line 267

def image_url(url)
  @ctx.image_url(url)
end
#inspect ⇒ String
# File 'lib/llm/agent.rb', line 345

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@llm=#{@llm.class}, @mode=#{mode.inspect}, @messages=#{messages.inspect}>"
end
#interrupt! ⇒ nil Also known as: cancel!
Interrupt the active request, if any.
# File 'lib/llm/agent.rb', line 248

def interrupt!
  @ctx.interrupt!
end
#local_file(path) ⇒ LLM::Object
Returns a tagged object.

# File 'lib/llm/agent.rb', line 276

def local_file(path)
  @ctx.local_file(path)
end
#mode ⇒ Symbol
# File 'lib/llm/agent.rb', line 305

def mode
  @ctx.mode
end
#model ⇒ String
Returns the model an Agent is actively using.

# File 'lib/llm/agent.rb', line 299

def model
  @ctx.model
end
#prompt(&b) ⇒ LLM::Prompt Also known as: build_prompt
# File 'lib/llm/agent.rb', line 257

def prompt(&b)
  @ctx.prompt(&b)
end
#remote_file(res) ⇒ LLM::Object
Returns a tagged object.

# File 'lib/llm/agent.rb', line 285

def remote_file(res)
  @ctx.remote_file(res)
end
#respond(prompt, params = {}) ⇒ LLM::Response
Note: Not all LLM providers support this API.
Maintain a conversation via the responses API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 202

def respond(prompt, params = {})
  run_loop(:respond, prompt, params)
end
#returns ⇒ Array<LLM::Function::Return>
# File 'lib/llm/agent.rb', line 221

def returns
  @ctx.returns
end
#serialize(**kw) ⇒ void Also known as: save
This method returns an undefined value.
# File 'lib/llm/agent.rb', line 353

def serialize(**kw)
  @ctx.serialize(**kw)
end
#talk(prompt, params = {}) ⇒ LLM::Response Also known as: chat
Maintain a conversation via the chat completions API. This method immediately sends a request to the LLM and returns the response.
# File 'lib/llm/agent.rb', line 180

def talk(prompt, params = {})
  run_loop(:talk, prompt, params)
end
#to_h ⇒ Hash
# File 'lib/llm/agent.rb', line 333

def to_h
  @ctx.to_h
end
#to_json ⇒ String
# File 'lib/llm/agent.rb', line 339

def to_json(...)
  to_h.to_json(...)
end
#tracer ⇒ LLM::Tracer
Returns an LLM tracer.

# File 'lib/llm/agent.rb', line 292

def tracer
  @tracer || @ctx.tracer
end
#wait ⇒ Array<LLM::Function::Return>
# File 'lib/llm/agent.rb', line 235

def wait(...)
  @tracer ? @llm.with_tracer(@tracer) { @ctx.wait(...) } : @ctx.wait(...)
end