Class: Ragents::Providers::RubyLLM

Inherits:
Ragents::Provider
Defined in:
lib/ragents/providers/ruby_llm.rb

Overview

RubyLLM provider - uses the ruby_llm gem as the underlying LLM backend. This gives access to 500+ models across all major providers: OpenAI, Anthropic, Gemini, Bedrock, DeepSeek, Mistral, Ollama, and more.

Examples:

Basic usage

# First, configure RubyLLM (e.g., in an initializer)
RubyLLM.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
end

# Then use with Ragents
provider = Ragents::Providers::RubyLLM.new(model: "gpt-4o")
agent = MyAgent.new(provider: provider)

Using different models

# OpenAI
provider = Ragents::Providers::RubyLLM.new(model: "gpt-4o")

# Anthropic Claude
provider = Ragents::Providers::RubyLLM.new(model: "claude-sonnet-4-20250514")

# Google Gemini
provider = Ragents::Providers::RubyLLM.new(model: "gemini-2.0-flash")

# Local Ollama
provider = Ragents::Providers::RubyLLM.new(model: "llama3.2")

Instance Attribute Summary

Attributes inherited from Ragents::Provider

#config

Instance Method Summary

Methods inherited from Ragents::Provider

#name

Constructor Details

#initialize(model: nil, **config) ⇒ RubyLLM

Returns a new instance of RubyLLM.



# File 'lib/ragents/providers/ruby_llm.rb', line 38

def initialize(model: nil, **config)
  require_ruby_llm!
  super(**config)
  @model = model || @config[:model]
end
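A minimal sketch of how the constructor splits its keywords (ProviderSketch below is a hypothetical stand-in, not the real Ragents::Provider): the model: keyword is captured separately, every other keyword lands in the config hash, and the model falls back to a :model entry in that config when the keyword is omitted.

```ruby
# Hypothetical stand-in illustrating the keyword handling in #initialize;
# the real class also calls super(**config) into Ragents::Provider.
class ProviderSketch
  attr_reader :model, :config

  def initialize(model: nil, **config)
    @config = config
    @model = model || @config[:model]
  end
end

p = ProviderSketch.new(model: "gpt-4o", timeout: 30)
p.model            # => "gpt-4o"
p.config[:timeout] # => 30
```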

Instance Attribute Details

#model ⇒ Object (readonly)

Returns the value of attribute model.



# File 'lib/ragents/providers/ruby_llm.rb', line 36

def model
  @model
end

Instance Method Details

#generate(messages:, tools: [], **options) ⇒ Object



# File 'lib/ragents/providers/ruby_llm.rb', line 44

def generate(messages:, tools: [], **options)
  chat = build_chat(options)
  attach_tools(chat, tools)

  # Add all messages to the chat
  formatted_messages = format_messages_for_ruby_llm(messages)
  formatted_messages.each do |msg|
    chat.add_message(**msg)
  end

  # Complete the chat
  response = chat.complete
  parse_ruby_llm_response(response)
end
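A toy sketch of the message-loading step in #generate: each formatted message hash is splatted into add_message with the double-splat operator. ChatStub below is a hypothetical stand-in for a RubyLLM chat object, not the real API.

```ruby
# Hypothetical stand-in for a RubyLLM chat, showing how **msg splats a
# hash of keyword arguments into add_message.
class ChatStub
  attr_reader :messages

  def initialize
    @messages = []
  end

  def add_message(role:, content:)
    @messages << { role: role, content: content }
  end
end

formatted = [
  { role: "user", content: "Hello" },
  { role: "assistant", content: "Hi there" }
]

chat = ChatStub.new
formatted.each { |msg| chat.add_message(**msg) }
chat.messages.length # => 2
```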

#stream(messages:, tools: [], **options, &block) ⇒ Object



# File 'lib/ragents/providers/ruby_llm.rb', line 59

def stream(messages:, tools: [], **options, &block)
  chat = build_chat(options)
  attach_tools(chat, tools)

  formatted_messages = format_messages_for_ruby_llm(messages)
  formatted_messages.each do |msg|
    chat.add_message(**msg)
  end

  # Use RubyLLM streaming
  response = chat.complete do |chunk|
    block.call(chunk.content) if block_given? && chunk.content
  end

  parse_ruby_llm_response(response)
end
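A toy sketch of the streaming dispatch used in #stream: complete yields chunks to a block, and only chunks with non-nil content are forwarded to the caller. FakeChunk and FakeChat are hypothetical stand-ins, not part of the ruby_llm gem.

```ruby
# Hypothetical stand-ins modeling the block-based streaming pattern:
# chunks with nil content (e.g. tool-call frames) are skipped.
FakeChunk = Struct.new(:content)

class FakeChat
  def complete
    [FakeChunk.new("Hel"), FakeChunk.new(nil), FakeChunk.new("lo")].each do |chunk|
      yield chunk
    end
    "Hello" # the final, fully assembled response
  end
end

received = []
FakeChat.new.complete do |chunk|
  received << chunk.content if chunk.content
end
received # => ["Hel", "lo"]  (the nil chunk was skipped)
```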