Class: Async::Ollama::Transform

Inherits:
Object
Defined in:
lib/async/ollama/transform.rb

Overview

Transforms file content using a local Ollama model.

Useful for intelligently merging template changes into existing files without overwriting user modifications.

Examples:

Transform a bake.rb to add missing release hooks:

new_content = Async::Ollama::Transform.call(
  File.read("bake.rb"),
  instruction: "Add the after_gem_release hooks if they are not already present.",
  template: File.read(template_path)
)
File.write("bake.rb", new_content)

Constant Summary

SYSTEM_PROMPT =
<<~PROMPT
	You are a precise file transformation tool.
	When given a file and instructions, you output ONLY the transformed file content.
	Do not add any explanation, preamble, summary of changes.
	Output the raw file content exactly as it should be executed.
	Do not include markdown code fences in your output.
	Preserve all existing content unless the instructions explicitly require removal.
	If a reference template is provided, use it as a structural guide for what to add or how to format new content — do not copy it verbatim or replace existing content with it.
PROMPT
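The system prompt above is paired with a user prompt assembled by a private build_prompt helper, whose implementation is not shown on this page. A hypothetical sketch of how such a helper might combine the content, instruction, and optional template (the field labels here are illustrative, not the library's actual format):

```ruby
# Hypothetical sketch — not the library's actual build_prompt.
# Combines the instruction, an optional reference template, and the
# file content into a single user message for the model.
def build_prompt(content, instruction, template)
	prompt = String.new
	prompt << "Instruction:\n#{instruction}\n\n"
	prompt << "Reference template:\n#{template}\n\n" if template
	prompt << "File content:\n#{content}"
	prompt
end

prompt = build_prompt("puts 'hi'", "Add a frozen_string_literal comment.", nil)
```

Putting the instruction first and the file content last keeps the model focused on the transformation task before it reads the (potentially long) file body.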

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(model: MODEL) ⇒ Transform

Returns a new instance of Transform.



# File 'lib/async/ollama/transform.rb', line 34

def initialize(model: MODEL)
	@model = model
end

Class Method Details

.call(content, **options) ⇒ Object

Convenience class method.



# File 'lib/async/ollama/transform.rb', line 62

def self.call(content, **options)
	new.call(content, **options)
end
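This is the common Ruby convenience-constructor pattern: the class method builds an instance with defaults and delegates to the instance method. A generic standalone sketch of the same pattern (Greeter is purely illustrative):

```ruby
# Generic sketch of the .call convenience pattern used by Transform:
# the class method instantiates with defaults and delegates to the
# instance method, forwarding any keyword options.
class Greeter
	def self.call(name, **options)
		new.call(name, **options)
	end

	def call(name, upcase: false)
		greeting = "Hello, #{name}"
		upcase ? greeting.upcase : greeting
	end
end
```

One-off callers can write `Greeter.call("world")`, while callers that need non-default construction can still instantiate explicitly and reuse the instance.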

Instance Method Details

#call(content, instruction:, template: nil, model: @model) ⇒ Object

Transform file content according to the given instruction.



# File 'lib/async/ollama/transform.rb', line 44

def call(content, instruction:, template: nil, model: @model)
	messages = [
		{role: "system", content: SYSTEM_PROMPT},
		{role: "user", content: build_prompt(content, instruction, template)}
	]
	
	Client.open do |client|
		reply = client.chat(messages, model: model)
		
		return strip_fences(reply.response)
	end
end
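The strip_fences helper called above is private and not documented on this page. It guards against models that wrap their output in a markdown code block despite the system prompt's instruction. A hedged sketch of one possible implementation (assumptions: fences appear only as a leading ```lang line and a trailing ``` line):

```ruby
# Hypothetical sketch — not the library's actual strip_fences.
# Removes a leading ```lang line and a trailing ``` line if the model
# wrapped its output in a markdown code block; otherwise returns the
# text unchanged.
def strip_fences(text)
	lines = text.lines
	lines.shift if lines.first&.match?(/\A```/)
	lines.pop if lines.last&.match?(/\A```\s*\z/)
	lines.join
end
```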