Module: LLM
- Defined in:
- lib/llm.rb,
lib/llm/agent.rb,
lib/llm/error.rb,
lib/llm/skill.rb,
lib/llm/buffer.rb,
lib/llm/stream.rb,
lib/llm/tracer.rb,
lib/llm/context.rb,
lib/llm/message.rb,
lib/llm/session.rb,
lib/llm/version.rb,
lib/llm/contract.rb,
lib/llm/response.rb,
lib/llm/tracer/null.rb,
lib/llm/eventhandler.rb,
lib/llm/json_adapter.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/zai.rb,
lib/llm/tracer/logger.rb,
lib/llm/providers/google.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/tracer/langsmith.rb,
lib/llm/tracer/telemetry.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb
Defined Under Namespace
Modules: ActiveRecord, Contract, EventStream, Sequel, Utils
Classes: Agent, Anthropic, Buffer, Context, Cost, DeepSeek, Error, EventHandler, File, Function, Google, JSONAdapter, LlamaCpp, MCP, Message, Mime, Model, Multipart, Object, Ollama, OpenAI, Prompt, Provider, Registry, Response, Schema, ServerTool, Skill, Stream, Tool, Tracer, Usage, XAI, ZAI
Constant Summary collapse
- RateLimitError = Class.new(Error)
  HTTPTooManyRequests
- ServerError = Class.new(Error)
  HTTPServerError
- FormatError = Class.new(Error)
  When given an input object that is not understood
- PromptError = Class.new(FormatError)
  When given a prompt object that is not understood
- InvalidRequestError = Class.new(Error)
  When given an invalid request
- ContextWindowError = Class.new(InvalidRequestError)
  When the context window is exceeded
- ToolLoopError = Class.new(Error)
  When stuck in a tool call loop
- Interrupt = Class.new(Error)
  When a request is interrupted
- NoSuchToolError = Class.new(Error)
  When a tool call cannot be mapped to a local tool
- NoSuchModelError = Class.new(Error)
  When Registry can’t map a model
- NoSuchRegistryError = Class.new(Error)
  When Registry can’t map a registry
- Bot = Context
  Backward-compatible alias
- Session = Context
  Deprecated. Use Context instead. Scheduled for removal in v6.0. Backward-compatible alias for LLM::Context
- VERSION = "4.21.0"
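The error constants above are built with `Class.new`, so each one is an anonymous subclass that takes its name from the constant it is assigned to, and a rescue on a parent class also catches its descendants. A minimal self-contained sketch of the pattern (using stand-in constants, not the gem's own classes):

```ruby
# Stand-ins mirroring the Class.new pattern above: each error class is
# created anonymously and named by the constant it is assigned to.
Error = Class.new(StandardError)
FormatError = Class.new(Error)
PromptError = Class.new(FormatError)

def classify(err)
  raise err, "boom"
rescue Error => e
  e.class.name # a rescue on Error catches every descendant
end

puts classify(PromptError)                        # => "PromptError"
puts PromptError.ancestors.include?(FormatError)  # => true
```

This is why callers can rescue `LLM::Error` to catch everything, or a narrower class such as `FormatError` to catch only that branch of the hierarchy.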
Class Method Summary collapse
- .anthropic ⇒ Anthropic
  A new instance of Anthropic.
- .clients ⇒ Object (private)
- .deepseek ⇒ LLM::DeepSeek
- .File(obj) ⇒ LLM::File
- .function(key, &b) ⇒ LLM::Function
  Define a function.
- .google ⇒ Google
  A new instance of Google.
- .json ⇒ Class
  Returns the JSON adapter used by the library.
- .json=(adapter) ⇒ void
  Sets the JSON adapter used by the library.
- .llamacpp(key: nil) ⇒ LLM::LlamaCpp
- .lock(name) ⇒ void
  Provides a thread-safe lock.
- .mcp(llm = nil) ⇒ LLM::MCP
- .ollama(key: nil) ⇒ Ollama
  A new instance of Ollama.
- .openai ⇒ OpenAI
  A new instance of OpenAI.
- .registry_for(llm) ⇒ LLM::Object
- .xai ⇒ XAI
  A new instance of XAI.
- .zai ⇒ ZAI
  A new instance of ZAI.
Class Method Details
.anthropic ⇒ Anthropic
Returns a new instance of Anthropic.
# File 'lib/llm.rb', line 100
def anthropic(**)
  lock(:require) { require_relative "llm/providers/anthropic" unless defined?(LLM::Anthropic) }
  LLM::Anthropic.new(**)
end
.clients ⇒ Object
This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.
# File 'lib/llm.rb', line 50
def self.clients = @clients
.deepseek ⇒ LLM::DeepSeek
# File 'lib/llm.rb', line 132
def deepseek(**)
  lock(:require) { require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek) }
  LLM::DeepSeek.new(**)
end
.File(obj) ⇒ LLM::File
# File 'lib/llm/file.rb', line 82
def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else
    raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
.function(key, &b) ⇒ LLM::Function
Define a function.

# File 'lib/llm.rb', line 193
def function(key, &b)
  LLM::Function.new(key, &b)
end
.google ⇒ Google
Returns a new instance of Google.
# File 'lib/llm.rb', line 108
def google(**)
  lock(:require) { require_relative "llm/providers/google" unless defined?(LLM::Google) }
  LLM::Google.new(**)
end
.json ⇒ Class
Returns the JSON adapter used by the library.

# File 'lib/llm.rb', line 69
def json
  @json ||= JSONAdapter::JSON
end
.json=(adapter) ⇒ void
This should be set once from the main thread when your program starts. Defaults to LLM::JSONAdapter::JSON.
This method returns an undefined value.
Sets the JSON adapter used by the library.

# File 'lib/llm.rb', line 81
def json=(adapter)
  @json =
    case adapter.to_s
    when "JSON", "json" then JSONAdapter::JSON
    when "Oj", "oj" then JSONAdapter::Oj
    when "Yajl", "yajl" then JSONAdapter::Yajl
    else
      is_class = Class === adapter
      is_subclass = is_class && adapter.ancestors.include?(LLM::JSONAdapter)
      if is_subclass
        adapter
      else
        raise TypeError, "Adapter must be a subclass of LLM::JSONAdapter"
      end
    end
end
.llamacpp(key: nil) ⇒ LLM::LlamaCpp
# File 'lib/llm.rb', line 124
def llamacpp(key: nil, **)
  lock(:require) { require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp) }
  LLM::LlamaCpp.new(key:, **)
end
.lock(name) ⇒ void
This method returns an undefined value.
Provides a thread-safe lock.

# File 'lib/llm.rb', line 202
def lock(name, &) = @monitors[name].synchronize(&)
.mcp(llm = nil) ⇒ LLM::MCP
# File 'lib/llm.rb', line 174
def mcp(llm = nil, **)
  LLM::MCP.new(llm, **)
end
.ollama(key: nil) ⇒ Ollama
Returns a new instance of Ollama.
# File 'lib/llm.rb', line 116
def ollama(key: nil, **)
  lock(:require) { require_relative "llm/providers/ollama" unless defined?(LLM::Ollama) }
  LLM::Ollama.new(key:, **)
end
.openai ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/llm.rb', line 140
def openai(**)
  lock(:require) { require_relative "llm/providers/openai" unless defined?(LLM::OpenAI) }
  LLM::OpenAI.new(**)
end
.registry_for(llm) ⇒ LLM::Object
# File 'lib/llm.rb', line 56
def self.registry_for(llm)
  lock(:registry) do
    name = Symbol === llm ? llm : llm.name
    @registry[name] ||= Registry.for(name)
  end
end