Class: Ragents::Providers::RubyLLM
- Inherits: Ragents::Provider
  - Object
  - Ragents::Provider
  - Ragents::Providers::RubyLLM
- Defined in: lib/ragents/providers/ruby_llm.rb
Overview
RubyLLM provider - uses the ruby_llm gem as the underlying LLM backend. This gives access to 500+ models across all major providers: OpenAI, Anthropic, Gemini, Bedrock, DeepSeek, Mistral, Ollama, and more.
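As a rough sketch of how a caller might drive this provider: only the class and method names below come from this page; the role/content hash shape for messages and the "gpt-4o" model id are assumptions, chosen because they are the common convention for chat-style LLM APIs.

```ruby
# Assumed message shape: an array of role/content hashes (assumption,
# not documented on this page).
messages = [
  { role: "system", content: "You are a concise assistant." },
  { role: "user",   content: "Say hello." }
]

# With the ruby_llm gem installed, a call might look like:
#   provider = Ragents::Providers::RubyLLM.new(model: "gpt-4o")
#   result = provider.generate(messages: messages)
messages.length # => 2
```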
Instance Attribute Summary
- #model ⇒ Object (readonly)
  Returns the value of attribute model.
Attributes inherited from Ragents::Provider
Instance Method Summary
- #initialize(model: nil, **config) ⇒ RubyLLM (constructor)
  A new instance of RubyLLM.
- #generate(messages:, tools: [], **options) ⇒ Object
- #stream(messages:, tools: [], **options, &block) ⇒ Object
Methods inherited from Ragents::Provider
Constructor Details
#initialize(model: nil, **config) ⇒ RubyLLM
Returns a new instance of RubyLLM.
# File 'lib/ragents/providers/ruby_llm.rb', line 38

def initialize(model: nil, **config)
  require_ruby_llm!
  super(**config)
  @model = model || @config[:model]
end
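The constructor's fallback (an explicit model: argument wins, otherwise the inherited @config[:model] is used) can be illustrated in isolation; resolve_model here is a hypothetical standalone helper, not part of the class:

```ruby
# Hypothetical helper mirroring `model || @config[:model]` from the
# constructor above.
def resolve_model(model, config)
  model || config[:model]
end

resolve_model("claude-3-5-sonnet", { model: "gpt-4o" }) # => "claude-3-5-sonnet"
resolve_model(nil, { model: "gpt-4o" })                 # => "gpt-4o"
```

Because nil is falsy in Ruby, `||` gives exactly this "explicit value, else configured default" behavior in one expression.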
Instance Attribute Details
#model ⇒ Object (readonly)
Returns the value of attribute model.
# File 'lib/ragents/providers/ruby_llm.rb', line 36

def model
  @model
end
Instance Method Details
#generate(messages:, tools: [], **options) ⇒ Object
# File 'lib/ragents/providers/ruby_llm.rb', line 44

def generate(messages:, tools: [], **options)
  chat = build_chat(options)
  attach_tools(chat, tools)

  # Add all messages to the chat
  formatted = format_messages(messages)
  formatted.each do |msg|
    chat.add_message(**msg)
  end

  # Complete the chat
  response = chat.complete
  parse_ruby_llm_response(response)
end
#stream(messages:, tools: [], **options, &block) ⇒ Object
# File 'lib/ragents/providers/ruby_llm.rb', line 59

def stream(messages:, tools: [], **options, &block)
  chat = build_chat(options)
  attach_tools(chat, tools)

  formatted = format_messages(messages)
  formatted.each do |msg|
    chat.add_message(**msg)
  end

  # Use RubyLLM streaming
  response = chat.complete do |chunk|
    block.call(chunk.content) if block_given? && chunk.content
  end

  parse_ruby_llm_response(response)
end
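Note that the streaming block only ever receives chunks whose content is non-nil, so tool-call or metadata chunks with no text are skipped. A minimal stand-in for that contract, assuming only that chunks respond to #content (Chunk here is a stub, not the real chunk class from the ruby_llm gem):

```ruby
# Stub chunk objects; real chunks come from the ruby_llm gem's
# streaming callback.
Chunk = Struct.new(:content)
chunks = [Chunk.new("Hel"), Chunk.new(nil), Chunk.new("lo")]

# Mirrors the guard in #stream: append only chunks with text content.
buffer = +""
chunks.each { |chunk| buffer << chunk.content if chunk.content }
buffer # => "Hello"
```

A consumer of #stream can therefore accumulate the streamed text with a simple `{ |text| buffer << text }` block and still receive the full parsed response as the return value.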