RubyLLM
A delightful Ruby interface to the latest large language models. Stop wrestling with multiple APIs and inconsistent interfaces. RubyLLM gives you a clean, unified way to work with models from OpenAI, Anthropic, and more.
Installation
Add it to your Gemfile:
gem 'ruby_llm'
Or install it yourself:
gem install ruby_llm
Quick Start
RubyLLM makes it dead simple to start chatting with AI models:
require 'ruby_llm'
# Configure your API keys
RubyLLM.configure do |config|
config.openai_api_key = ENV['OPENAI_API_KEY']
config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
config.default_model = 'gpt-4o-mini' # OpenAI's efficient model
end
# Start a conversation
chat = RubyLLM.chat
response = chat.ask "What's the best way to learn Ruby?"
puts response.content
Available Models
RubyLLM gives you access to the latest models from multiple providers. Check what's available:
# List all available models
RubyLLM.models.all.each do |model|
puts "#{model.display_name} (#{model.provider})"
puts " Context window: #{model.context_window}"
puts " Price: $#{model.input_price_per_million}/M tokens (input)"
puts " $#{model.output_price_per_million}/M tokens (output)"
end
# Get models by type
chat_models = RubyLLM.models.chat_models
embedding_models = RubyLLM.models.embedding_models
audio_models = RubyLLM.models.audio_models
image_models = RubyLLM.models.image_models
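Because each model entry responds to plain attribute readers like `provider` and `context_window`, the list can be filtered with ordinary Enumerable methods. A minimal sketch of that pattern using stand-in data (the Struct and the values here are illustrative, not RubyLLM's internal representation):

```ruby
# Illustrative stand-in for RubyLLM's model metadata objects.
Model = Struct.new(:display_name, :provider, :context_window, keyword_init: true)

models = [
  Model.new(display_name: 'GPT-4o mini',   provider: 'openai',    context_window: 128_000),
  Model.new(display_name: 'Claude 3 Opus', provider: 'anthropic', context_window: 200_000)
]

# With the real gem this would be RubyLLM.models.all.select { ... }
anthropic_models = models.select { |m| m.provider == 'anthropic' }
anthropic_models.each { |m| puts "#{m.display_name}: #{m.context_window} tokens" }
```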
Having a Conversation
Conversations are simple and natural, with automatic token counting built right in:
chat = RubyLLM.chat model: 'claude-3-opus-20240229'
# Single messages with token tracking
response = chat.ask "What's your favorite Ruby feature?"
puts "Response used #{response.input_tokens} input tokens and #{response.output_tokens} output tokens"
# Multi-turn conversations just work
chat.ask "Can you elaborate on that?"
chat.ask "How does that compare to Python?"
# Stream responses as they come
chat.ask "Tell me a story about a Ruby programmer" do |chunk|
print chunk.content
end
# Get token usage for the whole conversation from the last message
last_message = chat.messages.last
puts "Conversation used #{last_message.input_tokens} input tokens and #{last_message.output_tokens} output tokens"
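Combining the per-message token counts with the per-million-token prices from the model list gives a rough cost estimate. A back-of-the-envelope sketch in plain Ruby (the token counts and prices below are made-up figures for illustration; in practice they come from the response and model objects):

```ruby
# Hypothetical figures; real values come from response.input_tokens,
# response.output_tokens, and the model's price attributes.
input_tokens  = 12
output_tokens = 240
input_price_per_million  = 0.15  # $ per million input tokens
output_price_per_million = 0.60  # $ per million output tokens

cost = (input_tokens  * input_price_per_million +
        output_tokens * output_price_per_million) / 1_000_000.0
puts format('Estimated cost: $%.6f', cost)
```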
Choosing the Right Model
RubyLLM gives you easy access to model capabilities:
model = RubyLLM.models.find 'claude-3-opus-20240229'
model.context_window # => 200000
model.max_tokens # => 4096
model.supports_vision? # => true
model.supports_json_mode? # => true
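These capability flags make it straightforward to pick a model programmatically rather than hard-coding a name. A sketch of that idea with stand-in objects (the predicate names follow the accessors shown above, but the records and IDs here are invented for illustration):

```ruby
# Stand-in model records; with the real gem you would query the model list instead.
Model = Struct.new(:id, :supports_vision, keyword_init: true) do
  def supports_vision?
    supports_vision
  end
end

models = [
  Model.new(id: 'text-only-model',      supports_vision: false),
  Model.new(id: 'vision-capable-model', supports_vision: true)
]

# Pick the first model that can handle images.
vision_model = models.find(&:supports_vision?)
puts "Picked: #{vision_model.id}"
```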
Coming Soon
- Rails integration for seamless database and Active Record support
- Function calling / tool use capabilities
- Automatic retries and error handling
- Much more!
Development
After checking out the repo, run bin/setup to install dependencies. Then run bin/console for an interactive prompt.
Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/crmne/ruby_llm.
License
Released under the MIT License. See LICENSE.txt for details.