Class: RubyLLM::Agents::Base

Inherits:
BaseAgent
Extended by:
CallbacksDSL
Includes:
CallbacksExecution
Defined in:
lib/ruby_llm/agents/core/base.rb

Overview

Base class for LLM-powered conversational agents

Inherits from BaseAgent to use the middleware pipeline architecture while adding callback hooks for custom preprocessing and postprocessing.

Examples:

Creating an agent

class SearchAgent < ApplicationAgent
  model "gpt-4o"
  temperature 0.0
  timeout 30
  cache_for 1.hour

  param :query, required: true
  param :limit, default: 10

  def system_prompt
    "You are a search assistant..."
  end

  def user_prompt
    "Search for: #{query}"
  end
end

With callbacks

class SafeAgent < ApplicationAgent
  before_call :redact_pii
  after_call :log_response

  private

  def redact_pii(context)
    # Custom redaction logic
  end

  def log_response(context, response)
    Rails.logger.info("Response received")
  end
end

Calling an agent

SearchAgent.call(query: "red dress")
SearchAgent.call(query: "red dress", dry_run: true)    # Debug mode
SearchAgent.call(query: "red dress", skip_cache: true) # Bypass cache

Constant Summary

Constants included from DSL::Base

DSL::Base::PLACEHOLDER_PATTERN

Constants included from DSL::Caching

DSL::Caching::DEFAULT_CACHE_TTL

Constants included from CacheHelper

CacheHelper::NAMESPACE

Instance Attribute Summary

Attributes inherited from BaseAgent

#client, #model, #temperature, #tracked_tool_calls

Class Method Summary

Instance Method Summary

Methods included from CallbacksDSL

after, after_call, before, before_call, callbacks

Methods inherited from BaseAgent

#agent_cache_key, agent_middleware, aliases, all_agent_names, ask, #assistant_prompt, #cache_key_data, #cache_key_hash, call, #call, config_summary, #initialize, #messages, param, params, #process_response, #resolved_thinking, #schema, stream, streaming, #system_prompt, temperature, thinking, thinking_config, tools, use_middleware, #user_prompt

Methods included from DSL::Base

#active_overrides, #assistant, #assistant_config, #cache_prompts, #clear_override_cache!, #description, #model, #overridable?, #overridable_fields, #prompt, #returns, #schema, #system, #system_config, #timeout, #user, #user_config

Methods included from DSL::Reliability

#circuit_breaker, #circuit_breaker_config, #fallback_models, #fallback_provider, #fallback_providers, #non_fallback_errors, #on_failure, #reliability, #reliability_config, #reliability_configured?, #retries, #retries_config, #retryable_patterns, #total_timeout

Methods included from DSL::Caching

#cache, #cache_enabled?, #cache_for, #cache_key_excludes, #cache_key_includes, #cache_ttl, #caching_config

Methods included from DSL::Queryable

#cost_by_model, #executions, #failures, #last_run, #stats, #total_spent, #with_params

Methods included from DSL::Knowledge

#knowledge_entries, #knowledge_path, #knows

Methods included from CacheHelper

#cache_delete, #cache_exist?, #cache_increment, #cache_key, #cache_read, #cache_store, #cache_write

Methods included from DSL::Knowledge::InstanceMethods

#compiled_knowledge

Constructor Details

This class inherits a constructor from RubyLLM::Agents::BaseAgent

Class Method Details

.agent_type ⇒ Symbol

Returns the agent type for conversation agents

Returns:

  • (Symbol)

    :conversation



# File 'lib/ruby_llm/agents/core/base.rb', line 62

def agent_type
  :conversation
end
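Because .agent_type is a class method, callers can branch on it without instantiating an agent. A minimal stand-alone sketch (ChatLike and pipeline_for are illustrative, not part of the gem):

```ruby
# Hypothetical dispatch on agent_type; ChatLike stands in for an
# agent class that reports :conversation.
class ChatLike
  def self.agent_type
    :conversation
  end
end

# Route conversation agents to one pipeline, everything else to another.
def pipeline_for(agent_class)
  agent_class.agent_type == :conversation ? :chat_pipeline : :other_pipeline
end
```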

Instance Method Details

#execute(context) ⇒ void

This method returns an undefined value.

Execute the core LLM call with callback support

This extends BaseAgent’s execute method to add before/after callback hooks for custom preprocessing and postprocessing.

Parameters:

  • context — the execution context for this call

# File 'lib/ruby_llm/agents/core/base.rb', line 74

def execute(context)
  @context = context
  @execution_started_at = context.started_at || Time.current

  # Make context available to Tool instances during tool execution
  previous_context = Thread.current[:ruby_llm_agents_caller_context]
  Thread.current[:ruby_llm_agents_caller_context] = context

  # Run before_call callbacks
  run_callbacks(:before, context)

  # Execute the LLM call
  client = build_client(context)
  response = execute_llm_call(client, context)
  capture_response(response, context)
  processed_content = process_response(response)

  # Run after_call callbacks
  run_callbacks(:after, context, response)

  context.output = build_result(processed_content, response, context)
rescue RubyLLM::Agents::CancelledError
  context.output = Result.new(content: nil, cancelled: true)
rescue RubyLLM::UnauthorizedError, RubyLLM::ForbiddenError => e
  raise_with_setup_hint(e, context)
rescue RubyLLM::ModelNotFoundError => e
  raise_with_model_hint(e, context)
ensure
  Thread.current[:ruby_llm_agents_caller_context] = previous_context
end
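The before/after flow above can be sketched stand-alone. MiniAgent and LoggingAgent below are illustrative stand-ins for the gem's CallbacksDSL/CallbacksExecution machinery, not its actual internals:

```ruby
# Minimal sketch of the before_call/after_call hook pattern that
# execute wraps around the LLM call.
class MiniAgent
  def self.callbacks
    @callbacks ||= { before: [], after: [] }
  end

  def self.before_call(name)
    callbacks[:before] << name
  end

  def self.after_call(name)
    callbacks[:after] << name
  end

  def call(context)
    run_callbacks(:before, context)
    response = "response for #{context[:query]}" # stands in for the LLM call
    run_callbacks(:after, context, response)
    response
  end

  private

  # Invoke each registered hook by name, as execute does via run_callbacks.
  def run_callbacks(phase, *args)
    self.class.callbacks[phase].each { |cb| send(cb, *args) }
  end
end

class LoggingAgent < MiniAgent
  before_call :note_start
  after_call  :note_finish

  def log
    @log ||= []
  end

  private

  def note_start(context)
    log << "before:#{context[:query]}"
  end

  def note_finish(_context, response)
    log << "after:#{response}"
  end
end
```

Calling `LoggingAgent.new.call(query: "x")` runs the hooks in registration order around the (stubbed) LLM call, mirroring how execute brackets build_client/execute_llm_call.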

#resolved_tenant_id ⇒ String?

Returns the resolved tenant ID for tracking

Returns:

  • (String, nil)

    The tenant identifier



# File 'lib/ruby_llm/agents/core/base.rb', line 108

def resolved_tenant_id
  tenant = resolve_tenant
  return nil unless tenant

  tenant.is_a?(Hash) ? tenant[:id]&.to_s : nil
end
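The Hash branch above means only Hash tenants yield an ID. A stand-alone sketch of that rule (resolve_tenant is defined elsewhere in the gem, so the tenant is taken as a parameter here):

```ruby
# Stand-alone sketch of the tenant-ID resolution rule: Hash tenants
# expose :id (stringified); nil and non-Hash tenants resolve to nil.
def resolved_tenant_id(tenant)
  return nil unless tenant

  tenant.is_a?(Hash) ? tenant[:id]&.to_s : nil
end
```

Note the safe navigation on `tenant[:id]&.to_s`: a Hash tenant with no :id key also resolves to nil rather than raising.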