Module: Clacky::MessageFormat::OpenAI
Defined in: lib/clacky/message_format/open_ai.rb
Overview
Static helpers for OpenAI-compatible API message format.
The canonical internal @messages format IS OpenAI format, so this module mainly handles response parsing, tool result formatting, and message type identification — minimal transformation needed.
Class Method Summary
- .build_request_body(messages, model, tools, max_tokens, caching_enabled, vision_supported: true) ⇒ Hash
  Build an OpenAI-compatible request body.
- .content_to_blocks(content, vision_supported:) ⇒ Array<Hash>
  Convert a canonical content array to an OpenAI-compatible block array.
- .format_tool_results(response, tool_results) ⇒ Array<Hash>
  Format tool results into canonical messages to append to @messages.
- .normalize_block(block, vision_supported:) ⇒ Hash?
  Normalize a single canonical content block to OpenAI API format.
- .normalize_message_content(msg, vision_supported:) ⇒ Hash
  Process a single message's content through the canonical→OpenAI conversion layer.
- .parse_response(data) ⇒ Hash
  Parse an OpenAI-compatible API response into the canonical internal format.
- .tool_call_ids(msg) ⇒ Object
  Returns the tool_call_ids referenced in a tool result message.
- .tool_result_message?(msg) ⇒ Boolean
  Returns true if the message is a canonical tool result.
Class Method Details
.build_request_body(messages, model, tools, max_tokens, caching_enabled, vision_supported: true) ⇒ Hash
Build an OpenAI-compatible request body.
Messages go through the canonical→OpenAI conversion layer (normalize_messages). For most models this is identity because the internal canonical format IS OpenAI format. The conversion handles one edge case: image_url content blocks are stripped when vision_supported is false (e.g. DeepSeek, Kimi, MiniMax), replacing them with a text placeholder so the API doesn’t reject the request with “unknown variant ‘image_url’”.
# File 'lib/clacky/message_format/open_ai.rb', line 47

def build_request_body(messages, model, tools, max_tokens, caching_enabled, vision_supported: true)
  messages = messages.map { |msg| normalize_message_content(msg, vision_supported: vision_supported) }
  body = { model: model, max_tokens: max_tokens, messages: messages }

  if tools&.any?
    if caching_enabled
      cached_tools = deep_clone(tools)
      cached_tools.last[:cache_control] = { type: "ephemeral" }
      body[:tools] = cached_tools
    else
      body[:tools] = tools
    end
  end

  body
end
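A hypothetical call is sketched below. Since the module is not loaded here, only the documented caching branch is re-created inline, and deep_clone is approximated with a JSON round-trip; the message and tool values are invented for illustration.

```ruby
require "json"

# Hypothetical inputs; names and tool shapes are illustrative only.
messages = [{ role: "user", content: "Hello" }]
tools = [
  { type: "function", function: { name: "read_file" } },
  { type: "function", function: { name: "run_tests" } }
]

# Documented behavior: with caching_enabled, the tools array is deep-cloned
# and the last entry gets cache_control, so the caller's array is untouched.
cached_tools = JSON.parse(JSON.generate(tools), symbolize_names: true)
cached_tools.last[:cache_control] = { type: "ephemeral" }

body = { model: "gpt-4o", max_tokens: 1024, messages: messages, tools: cached_tools }
```

Note that mutating a deep clone (rather than the caller's array) is what keeps repeated request builds from stacking multiple cache_control markers onto the same tool list.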
.content_to_blocks(content, vision_supported:) ⇒ Array<Hash>
Convert canonical content array to OpenAI-compatible block array. Each block goes through normalize_block; nil results are compacted.
# File 'lib/clacky/message_format/open_ai.rb', line 90

def content_to_blocks(content, vision_supported:)
  content.map { |b| normalize_block(b, vision_supported: vision_supported) }.compact
end
.format_tool_results(response, tool_results) ⇒ Array<Hash>
Format tool results into canonical messages to append to @messages.
# File 'lib/clacky/message_format/open_ai.rb', line 174

def format_tool_results(response, tool_results)
  results_map = tool_results.each_with_object({}) { |r, h| h[r[:id]] = r }

  response[:tool_calls].map do |tc|
    result = results_map[tc[:id]]
    raw_content = result ? result[:content] : { error: "Tool result missing" }.to_json

    # OpenAI tool message content must be a String.
    # If a tool returned multipart Array blocks (e.g. screenshot image), convert to JSON.
    content = raw_content.is_a?(Array) ? JSON.generate(raw_content) : raw_content

    { role: "tool", tool_call_id: tc[:id], content: content }
  end
end
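The mapping above can be exercised inline as a sketch, assuming tool results are hashes of the form { id:, content: } as the code suggests; the ids and content strings here are invented. A tool call with no matching result becomes a JSON error payload:

```ruby
require "json"

tool_calls   = [{ id: "call_1" }, { id: "call_2" }]
tool_results = [{ id: "call_1", content: "file read OK" }] # no result for call_2

results_map = tool_results.each_with_object({}) { |r, h| h[r[:id]] = r }
tool_messages = tool_calls.map do |tc|
  result = results_map[tc[:id]]
  raw = result ? result[:content] : { error: "Tool result missing" }.to_json
  # Array content (multipart blocks) would be serialized to a JSON string here.
  content = raw.is_a?(Array) ? JSON.generate(raw) : raw
  { role: "tool", tool_call_id: tc[:id], content: content }
end
```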
.normalize_block(block, vision_supported:) ⇒ Hash?
Normalize a single canonical content block to OpenAI API format.
Canonical text blocks pass through (with cache_control preserved). image_url blocks are kept for vision-capable models and replaced with a text placeholder for non-vision models (DeepSeek, Kimi, etc.).
# File 'lib/clacky/message_format/open_ai.rb', line 103

def normalize_block(block, vision_supported:)
  return block unless block.is_a?(Hash)

  case block[:type]
  when "text"
    # Drop empty text blocks — most APIs (Anthropic, DeepSeek, etc.)
    # reject { type: "text", text: "" }.
    text = block[:text]
    return nil if text.nil? || text.empty?

    result = { type: "text", text: text }
    result[:cache_control] = block[:cache_control] if block[:cache_control]
    result
  when "image_url"
    if vision_supported
      block # Pass through — GPT-4V, Gemini, etc. accept image_url
    else
      # Replace with text placeholder so the API doesn't reject the
      # request. The model will still see the context that an image
      # was present (from file_prompt / system_injected metadata).
      { type: "text",
        text: "[Image content removed — current model does not support vision input]" }
    end
  else
    block # Pass through unknown block types (tool_use, tool_result, etc.)
  end
end
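The three outcomes (drop, pass through, replace) can be condensed into a small standalone sketch; the helper name is hypothetical and cache_control preservation is omitted for brevity:

```ruby
# Condensed sketch of the documented block rules; helper name is hypothetical.
def normalize_block_sketch(block, vision_supported:)
  return block unless block.is_a?(Hash)

  case block[:type]
  when "text"
    text = block[:text]
    return nil if text.nil? || text.empty? # empty text blocks are dropped
    { type: "text", text: text }
  when "image_url"
    return block if vision_supported # vision-capable models accept image_url as-is
    { type: "text",
      text: "[Image content removed — current model does not support vision input]" }
  else
    block # unknown block types pass through
  end
end
```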
.normalize_message_content(msg, vision_supported:) ⇒ Hash
Process a single message’s content through the canonical→OpenAI conversion layer. For String content this is a no-op; for Array content each block goes through normalize_block.
# File 'lib/clacky/message_format/open_ai.rb', line 74

def normalize_message_content(msg, vision_supported:)
  content = msg[:content]
  return msg unless content.is_a?(Array)

  blocks = content_to_blocks(content, vision_supported: vision_supported)
  # Most APIs reject empty content arrays — use a placeholder text block.
  blocks = [{ type: "text", text: "..." }] if blocks.empty?
  msg.merge(content: blocks)
end
.parse_response(data) ⇒ Hash
Parse OpenAI-compatible API response into canonical internal format.
# File 'lib/clacky/message_format/open_ai.rb', line 135

def parse_response(data)
  message = data["choices"].first["message"]
  usage = data["usage"] || {}
  raw_api_usage = usage.dup

  usage_data = {
    prompt_tokens: usage["prompt_tokens"],
    completion_tokens: usage["completion_tokens"],
    total_tokens: usage["total_tokens"]
  }
  usage_data[:api_cost] = usage["cost"] if usage["cost"]
  usage_data[:cache_creation_input_tokens] = usage["cache_creation_input_tokens"] if usage["cache_creation_input_tokens"]
  usage_data[:cache_read_input_tokens] = usage["cache_read_input_tokens"] if usage["cache_read_input_tokens"]

  # OpenRouter stores cache info under prompt_tokens_details
  if (details = usage["prompt_tokens_details"])
    usage_data[:cache_read_input_tokens] = details["cached_tokens"] if details["cached_tokens"].to_i > 0
    usage_data[:cache_creation_input_tokens] = details["cache_write_tokens"] if details["cache_write_tokens"].to_i > 0
  end

  result = {
    content: message["content"],
    tool_calls: parse_tool_calls(message["tool_calls"]),
    finish_reason: data["choices"].first["finish_reason"],
    usage: usage_data,
    raw_api_usage: raw_api_usage
  }
  # Preserve reasoning_content (e.g. Kimi/Moonshot extended thinking)
  result[:reasoning_content] = message["reasoning_content"] if message["reasoning_content"]
  result
end
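For illustration, here is a minimal OpenRouter-style response (all field values invented) and how the cached-token detail surfaces in the parsed usage. The tool_calls branch is omitted since parse_tool_calls is not shown in this section:

```ruby
# Hypothetical response payload; only the documented fields are exercised.
data = {
  "choices" => [{ "message" => { "content" => "Hi there" }, "finish_reason" => "stop" }],
  "usage" => {
    "prompt_tokens" => 120, "completion_tokens" => 5, "total_tokens" => 125,
    "prompt_tokens_details" => { "cached_tokens" => 100 }
  }
}

message = data["choices"].first["message"]
usage = data["usage"] || {}
usage_data = {
  prompt_tokens: usage["prompt_tokens"],
  completion_tokens: usage["completion_tokens"],
  total_tokens: usage["total_tokens"]
}
# OpenRouter-style cache info lives under prompt_tokens_details.
if (details = usage["prompt_tokens_details"])
  usage_data[:cache_read_input_tokens] = details["cached_tokens"] if details["cached_tokens"].to_i > 0
end

result = { content: message["content"],
           finish_reason: data["choices"].first["finish_reason"],
           usage: usage_data }
```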
.tool_call_ids(msg) ⇒ Object
Returns the tool_call_ids referenced in a tool result message.
# File 'lib/clacky/message_format/open_ai.rb', line 21

def tool_call_ids(msg)
  return [] unless tool_result_message?(msg)

  [msg[:tool_call_id]]
end
.tool_result_message?(msg) ⇒ Boolean
Returns true if the message is a canonical tool result.
# File 'lib/clacky/message_format/open_ai.rb', line 16

def tool_result_message?(msg)
  msg[:role] == "tool" && !msg[:tool_call_id].nil?
end
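The two predicates above pair naturally; a standalone sketch (method names suffixed to mark them as re-statements, message values invented):

```ruby
# Sketch of the two predicates working together, mirroring the documented logic.
def tool_result_message_sketch?(msg)
  msg[:role] == "tool" && !msg[:tool_call_id].nil?
end

def tool_call_ids_sketch(msg)
  return [] unless tool_result_message_sketch?(msg)
  [msg[:tool_call_id]]
end
```

Because the canonical format is OpenAI format, a tool result is identified purely by its role and tool_call_id, with no extra type tagging needed.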