Class: Async::Ollama::Client
- Inherits: REST::Resource
  - Object
  - REST::Resource
  - Async::Ollama::Client
- Defined in:
- lib/async/ollama/client.rb
Overview
Represents a connection to the Ollama service, providing methods to generate completions, chat, and list models.
Constant Summary
- ENDPOINT = Async::HTTP::Endpoint.parse("http://localhost:11434")
  The default endpoint to connect to.
Instance Method Summary
-
#chat(messages, **options, &block) ⇒ Object
Sends a chat request with the given messages to Ollama.
-
#generate(prompt, **options, &block) ⇒ Object
Generates a response from the given prompt using Ollama.
-
#models ⇒ Object
Retrieves the list of available models from Ollama.
Instance Method Details
#chat(messages, **options, &block) ⇒ Object
Sends a chat request with the given messages to Ollama.
# File 'lib/async/ollama/client.rb', line 42

def chat(messages, **options, &block)
	options[:model] ||= MODEL
	options[:messages] = messages

	Chat.post(self.with(path: "/api/chat"), options) do |resource, response|
		if block_given?
			yield response
		end

		Chat.new(resource, value: response.read, metadata: response.headers)
	end
end
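A usage sketch for #chat: messages follow the Ollama chat API shape (hashes with "role" and "content" keys). This assumes a local Ollama server is running at the default ENDPOINT and that the client is opened with the gem's `Client.open` block helper; the model name "llama3" and the `value` reader on the returned Chat resource are illustrative assumptions.

```ruby
require "async/ollama"

Async::Ollama::Client.open do |client|
	messages = [
		{role: "user", content: "What is the capital of France?"}
	]

	# Returns a Chat resource wrapping the decoded response body.
	chat = client.chat(messages, model: "llama3")
	puts chat.value
end
```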
#generate(prompt, **options, &block) ⇒ Object
Generates a response from the given prompt using Ollama.
# File 'lib/async/ollama/client.rb', line 25

def generate(prompt, **options, &block)
	options[:prompt] = prompt
	options[:model] ||= MODEL

	Generate.post(self.with(path: "/api/generate"), options) do |resource, response|
		if block_given?
			yield response
		end

		Generate.new(resource, value: response.read, metadata: response.headers)
	end
end
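A usage sketch for #generate: the prompt is passed positionally and merged into the request options; any other keyword arguments (e.g. a model override) are forwarded to the API. The optional block receives the raw HTTP response, which is useful for inspecting headers or streaming. The `response` reader on the returned Generate resource is an assumption based on the gem's README.

```ruby
require "async/ollama"

Async::Ollama::Client.open do |client|
	# The block form observes the raw HTTP response before it is wrapped.
	generator = client.generate("Tell me the first 10 digits of pi.") do |response|
		puts "Status: #{response.status}"
	end

	puts generator.response
end
```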
#models ⇒ Object
Retrieves the list of available models from Ollama.
# File 'lib/async/ollama/client.rb', line 57

def models
	Models.get(self.with(path: "/api/tags"))
end
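A usage sketch for #models: it issues a GET against Ollama's /api/tags endpoint and returns a Models resource. The `value` reader and the "models" key in the decoded payload are assumptions based on the Ollama tags API response shape ({"models": [{"name": ...}, ...]}).

```ruby
require "async/ollama"

Async::Ollama::Client.open do |client|
	models = client.models

	# List the names of locally available models.
	models.value["models"].each do |model|
		puts model["name"]
	end
end
```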