Class: Async::Ollama::Transform
Inherits: Object
  - Object
  - Async::Ollama::Transform
Defined in: lib/async/ollama/transform.rb
Overview
Transforms file content using a local Ollama model.
Useful for intelligently merging template changes into existing files without overwriting user modifications.
Constant Summary

SYSTEM_PROMPT =
  <<~PROMPT
    You are a precise file transformation tool. When given a file and instructions, you output ONLY the transformed file content. Do not add any explanation, preamble, or summary of changes. Output the raw file content exactly as it should be executed. Do not include markdown code fences in your output. Preserve all existing content unless the instructions explicitly require removal. If a reference template is provided, use it as a structural guide for what to add or how to format new content — do not copy it verbatim or replace existing content with it.
  PROMPT
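The user message paired with this system prompt is assembled by a `build_prompt` helper that this page does not show. A minimal sketch of what it might look like (the section labels and their ordering are assumptions, not the gem's actual implementation):

```ruby
# Hypothetical sketch: build_prompt is not shown on this page, so the
# section labels and their order here are assumptions, not the gem's code.
def build_prompt(content, instruction, template)
  prompt = +"Instructions:\n#{instruction}\n\n"
  # Per the system prompt, a reference template is a structural guide only,
  # so it is included as a clearly separated section when present.
  prompt << "Reference template:\n#{template}\n\n" if template
  prompt << "File content:\n#{content}"
end
```

Keeping the instruction first and the file content last mirrors the system prompt's framing: the model reads what to do, then what to transform.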
Class Method Summary
- .call(content, **options) ⇒ Object
  Convenience class method.
Instance Method Summary
- #call(content, instruction:, template: nil, model: @model) ⇒ Object
  Transform file content according to the given instruction.
- #initialize(model: MODEL) ⇒ Transform (constructor)
  A new instance of Transform.
Constructor Details

#initialize(model: MODEL) ⇒ Transform

Returns a new instance of Transform.
Class Method Details
.call(content, **options) ⇒ Object
Convenience class method.
# File 'lib/async/ollama/transform.rb', line 62

def self.call(content, **)
  new.call(content, **)
end
Instance Method Details
#call(content, instruction:, template: nil, model: @model) ⇒ Object
Transform file content according to the given instruction.
# File 'lib/async/ollama/transform.rb', line 44

def call(content, instruction:, template: nil, model: @model)
  messages = [
    {role: "system", content: SYSTEM_PROMPT},
    {role: "user", content: build_prompt(content, instruction, template)}
  ]

  Client.open do |client|
    reply = client.chat(messages, model: model)
    return strip_fences(reply.response)
  end
end
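The `strip_fences` helper called above is not shown on this page. A plausible sketch, assuming it removes a single surrounding pair of markdown code fences that a model may emit despite the system prompt forbidding them:

```ruby
# Hypothetical sketch of strip_fences (not the gem's actual implementation):
# drop one leading fence line (optionally with a language tag) and one
# trailing fence line, leaving everything else untouched.
def strip_fences(text)
  lines = text.lines
  lines.shift if lines.first&.match?(/\A```\w*\s*\z/)
  lines.pop if lines.last&.match?(/\A```\s*\z/)
  lines.join
end
```

Stripping only one outer fence pair keeps any fenced blocks that legitimately belong inside the transformed file intact.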