Class: Fastlane::Actions::OpenaiAskAction
- Inherits: Action
  - Object
  - Action
  - Fastlane::Actions::OpenaiAskAction
- Defined in:
  lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb
Constant Summary
- OPENAI_API_ENDPOINT =
  URI('https://api.openai.com/v1/chat/completions').freeze
- DEFAULT_MAX_TOOL_ITERATIONS =
  5
- DEFAULT_MODEL =
  'gpt-4o'
- PREDEFINED_PROMPTS =
  {
    release_notes: <<~PROMPT
      Act like a mobile app marketer who wants to prepare release notes for Google Play and App Store.
      Do not write it point by point and keep it under 350 characters. It should be a unique paragraph.
      When provided a list, use the number of any potential "*" in brackets at the start of each item as indicator of importance.
      Ignore items starting with "[Internal]", and ignore links to GitHub.
    PROMPT
  }.freeze
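A predefined prompt is selected by passing its Symbol key as the `prompt` parameter; `run` then swaps the Symbol for the prompt text. A minimal sketch of that resolution, using an illustrative stand-in hash (`PROMPTS` and `resolve_prompt` are not names from the action):

```ruby
# Stand-in for PREDEFINED_PROMPTS: the real hash maps :release_notes to the
# full marketer prompt shown above.
PROMPTS = { release_notes: 'Act like a mobile app marketer...' }.freeze

# Mirrors the lookup in `run`: a known Symbol resolves to its predefined
# prompt text; a String passes through unchanged as a custom prompt.
def resolve_prompt(prompt)
  PROMPTS.key?(prompt) ? PROMPTS[prompt] : prompt
end
```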
Documentation
- .authors ⇒ Object
- .available_options ⇒ Object
- .available_prompt_symbols ⇒ Object
- .description ⇒ Object
- .details ⇒ Object
- .examples ⇒ Object
- .is_supported?(_platform) ⇒ Boolean
- .return_value ⇒ Object
Class Method Summary
- .execute_tool_call(tool_call, tool_handlers) ⇒ Object
- .format_message(role:, text:) ⇒ Object
- .invoke_tool_handler(name:, handler:, args:) ⇒ Object
  Invokes a tool handler safely.
- .parse_assistant_message(response) ⇒ Object
- .parse_text_response(response) ⇒ Object
- .request_body(prompt:, question:, model: DEFAULT_MODEL) ⇒ Object
- .request_body_with_messages(messages:, tools:, model: DEFAULT_MODEL) ⇒ Object
- .run(params) ⇒ Object
- .run_with_tools(prompt:, question:, model:, tools:, tool_handlers:, max_tool_iterations:, headers:) ⇒ Object
- .serialize_tool_result(name:, result:) ⇒ Object
  Serializes a tool result to a JSON string.
Class Method Details
.authors ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 214

def self.authors
  ['Automattic']
end
.available_options ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 282

def self.available_options
  [
    FastlaneCore::ConfigItem.new(key: :prompt,
                                 description: 'The internal top-level instructions to give to the model to tell it how to behave. ' \
                                   + "Use a Ruby Symbol from one of [#{available_prompt_symbols}] to use a predefined prompt instead of writing your own",
                                 optional: true,
                                 default_value: nil,
                                 type: String,
                                 skip_type_validation: true,
                                 verify_block: proc do |value|
                                   next if value.is_a?(String)
                                   next if PREDEFINED_PROMPTS.include?(value)

                                   UI.user_error!("Parameter `prompt` can only be a String or one of the following Symbols: [#{available_prompt_symbols}]")
                                 end),
    FastlaneCore::ConfigItem.new(key: :question,
                                 description: 'The user message to ask the question to the OpenAI model',
                                 optional: false,
                                 default_value: nil,
                                 type: String),
    FastlaneCore::ConfigItem.new(key: :api_token,
                                 description: 'The OpenAI API Token to use for the request',
                                 env_name: 'OPENAI_API_TOKEN',
                                 optional: false,
                                 sensitive: true,
                                 type: String),
    FastlaneCore::ConfigItem.new(key: :model,
                                 description: 'The OpenAI model to send the request to (e.g. `gpt-4o`, `gpt-4o-mini`, `gpt-4.1`). ' \
                                   "Defaults to `#{DEFAULT_MODEL}`",
                                 optional: true,
                                 default_value: DEFAULT_MODEL,
                                 type: String),
    FastlaneCore::ConfigItem.new(key: :tools,
                                 description: 'Optional array of tool (function-calling) definitions in OpenAI format. ' \
                                   'When provided, the action runs a tool-use loop',
                                 optional: true,
                                 default_value: nil,
                                 type: Array,
                                 verify_block: proc do |value|
                                   UI.user_error!('Parameter `tools` must be a non-empty Array when provided') if value.empty?
                                 end),
    FastlaneCore::ConfigItem.new(key: :tool_handlers,
                                 description: 'Hash of tool name to a callable (e.g. a Proc) invoked when the model calls that tool. ' \
                                   'The callable receives the parsed arguments Hash and must return a JSON-serializable value, ' \
                                   'which is sent back to the model as the tool result',
                                 optional: true,
                                 default_value: nil,
                                 type: Hash,
                                 verify_block: proc do |value|
                                   non_callable = value.reject { |_k, v| v.respond_to?(:call) }
                                   UI.user_error!("Parameter `tool_handlers` values must respond to :call. Non-callable handlers: #{non_callable.keys}") if non_callable.any?
                                 end),
    FastlaneCore::ConfigItem.new(key: :max_tool_iterations,
                                 description: 'Maximum number of tool-use loop iterations before the action fails. ' \
                                   'Only used when `tools` are provided',
                                 optional: true,
                                 default_value: DEFAULT_MAX_TOOL_ITERATIONS,
                                 type: Integer,
                                 verify_block: proc do |value|
                                   UI.user_error!("Parameter `max_tool_iterations` must be >= 1 (got #{value})") if value < 1
                                 end),
  ]
end
.available_prompt_symbols ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 278

def self.available_prompt_symbols
  PREDEFINED_PROMPTS.keys.map { |v| "`:#{v}`" }.join(',')
end
.description ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 210

def self.description
  'Use OpenAI API to generate response to a prompt'
end
.details ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 223

def self.details
  <<~DETAILS
    Uses the OpenAI API to generate response to a prompt.
    Can be used to e.g. ask it to generate Release Notes based on a bullet point technical changelog or similar.

    When `tools` and `tool_handlers` are provided, the action runs a tool-use (function-calling) loop:
    on each turn, if the model calls one or more tools, the corresponding handler is invoked locally
    and its return value is sent back to the model as a `role: tool` message.
    The loop ends when the model returns a plain text response, or when `max_tool_iterations` is reached.
  DETAILS
end
.examples ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 235

def self.examples
  [
    <<~'EXAMPLE',
      # (first example body not recovered in this extraction)
    EXAMPLE
    <<~'EXAMPLE',
      # Tool-use loop: the model proposes release notes via a tool call; the handler validates
      # length locally and rejects until the model produces text under the limit.
      notes = openai_ask(
        prompt: :release_notes,
        question: "Write release notes for: #{items}. Call the validate_length tool with your draft and iterate until it accepts.",
        api_token: get_required_env('OPENAI_API_TOKEN'),
        tools: [{
          type: 'function',
          function: {
            name: 'validate_length',
            description: 'Validates the length of the proposed release notes against a 350-character budget. ' \
                         'Returns `{ ok: true, length: }` if the text fits, or `{ ok: false, length:, max: }` otherwise. ' \
                         'Call repeatedly with shorter drafts until it returns ok: true.',
            parameters: { type: 'object', properties: { text: { type: 'string' } }, required: ['text'] }
          }
        }],
        tool_handlers: {
          'validate_length' => ->(args) {
            len = args['text'].length
            len <= 350 ? { ok: true, length: len } : { ok: false, length: len, max: 350 }
          }
        }
      )
    EXAMPLE
  ]
end
.execute_tool_call(tool_call, tool_handlers) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 143

def self.execute_tool_call(tool_call, tool_handlers)
  name = tool_call.dig('function', 'name')
  raw_args = tool_call.dig('function', 'arguments') || '{}'

  result = begin
    args = JSON.parse(raw_args)
    invoke_tool_handler(name: name, handler: tool_handlers[name], args: args)
  rescue JSON::ParserError
    # Short-circuit: the handler never sees malformed args. Tell the model the
    # tool-call payload was invalid so it can retry with valid JSON, and log the
    # raw arguments locally for debugging without forwarding them to the API.
    UI.error("Invalid JSON arguments for tool '#{name}'. Raw payload: #{raw_args}")
    { error: "Invalid JSON arguments for tool '#{name}' — payload could not be parsed. Retry with valid JSON." }
  end

  {
    role: 'tool',
    tool_call_id: tool_call['id'],
    content: serialize_tool_result(name: name, result: result)
  }
end
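The JSON guard above can be sketched in isolation: malformed tool-call arguments never reach a handler; a structured error goes back to the model instead. `parse_tool_args` is an illustrative stand-in for the parsing step, not a method of the action:

```ruby
require 'json'

# Mirrors execute_tool_call's argument handling: nil arguments default to an
# empty object, and a JSON::ParserError becomes a recoverable error Hash.
def parse_tool_args(name, raw_args)
  JSON.parse(raw_args || '{}')
rescue JSON::ParserError
  { error: "Invalid JSON arguments for tool '#{name}'" }
end
```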
.format_message(role:, text:) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 114

def self.format_message(role:, text:)
  return nil if text.nil? || text.empty?

  { role: role, content: [{ type: 'text', text: text }] }
end
.invoke_tool_handler(name:, handler:, args:) ⇒ Object
Invokes a tool handler safely. Returns a JSON-serializable value that will be sent back to the model as the `content` of a `role: tool` message (the value may be a Hash, Array, scalar, etc. — whatever the handler returns).
- Missing or non-callable handler: structured `{ error: … }` so the model can recover.
- Handler raised: structured `{ error:, exception: }` carrying only the exception class so the model can see the failure category and adjust. The full message and backtrace are logged locally via `UI.error` but NOT forwarded to the model, because tool results are sent to OpenAI and handler exception messages can contain secrets (tokens, file contents, internal API responses). The loop keeps going rather than aborting the lane mid-conversation — the model is the better judge of whether the failure is recoverable than a global `rescue` here.
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 194

def self.invoke_tool_handler(name:, handler:, args:)
  return { error: "No handler defined for tool '#{name}'" } if handler.nil?
  return { error: "Handler for tool '#{name}' is not callable (got #{handler.class})" } unless handler.respond_to?(:call)

  begin
    handler.call(args)
  rescue StandardError => e
    UI.error("Handler for tool '#{name}' raised #{e.class}: #{e.message}\n#{e.backtrace&.first(5)&.join("\n")}")
    { error: "Handler for tool '#{name}' raised an exception", exception: e.class.name }
  end
end
.is_supported?(_platform) ⇒ Boolean
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 346

def self.is_supported?(_platform)
  true
end
.parse_assistant_message(response) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 133

def self.parse_assistant_message(response)
  case response
  when Net::HTTPOK
    json = JSON.parse(response.body)
    json['choices']&.first&.dig('message') || {}
  else
    UI.user_error!("Error in OpenAI API response: #{response}. #{response.body}")
  end
end
.parse_text_response(response) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 123

def self.parse_text_response(response)
  case response
  when Net::HTTPOK
    json = JSON.parse(response.body)
    json['choices']&.first&.dig('message', 'content')
  else
    UI.user_error!("Error in OpenAI API response: #{response}. #{response.body}")
  end
end
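The `dig` chain can be exercised against a hypothetical chat-completion body (the JSON below is a minimal stand-in for an OpenAI response, not captured output). The safe navigation means a body without `choices` yields nil rather than raising:

```ruby
require 'json'

# Minimal stand-in for a chat-completion response body.
body = JSON.generate(
  choices: [{ message: { role: 'assistant', content: 'Bug fixes and performance improvements.' } }]
)

# The same extraction parse_text_response performs on a Net::HTTPOK body.
json = JSON.parse(body)
content = json['choices']&.first&.dig('message', 'content')
```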
.request_body(prompt:, question:, model: DEFAULT_MODEL) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 88

def self.request_body(prompt:, question:, model: DEFAULT_MODEL)
  {
    model: model,
    response_format: { type: 'text' },
    temperature: 1,
    max_tokens: 2048,
    top_p: 1,
    messages: [
      format_message(role: 'system', text: prompt),
      format_message(role: 'user', text: question),
    ].compact
  }.to_json
end
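The messages-array construction relies on `format_message` returning nil for an absent prompt, so `.compact` drops the system message and leaves a user-only conversation. A self-contained sketch (with `format_message` reproduced from the action):

```ruby
# Reproduced from the action for self-containment.
def format_message(role:, text:)
  return nil if text.nil? || text.empty?

  { role: role, content: [{ type: 'text', text: text }] }
end

# With a nil prompt, .compact removes the nil system entry.
messages = [
  format_message(role: 'system', text: nil),
  format_message(role: 'user', text: 'Summarize the changelog'),
].compact
```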
.request_body_with_messages(messages:, tools:, model: DEFAULT_MODEL) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 102

def self.request_body_with_messages(messages:, tools:, model: DEFAULT_MODEL)
  {
    model: model,
    response_format: { type: 'text' },
    temperature: 1,
    max_tokens: 2048,
    top_p: 1,
    messages: messages,
    tools: tools
  }.to_json
end
.return_value ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 218

def self.return_value
  'The response text from the prompt as returned by OpenAI API. ' \
  'When `tools` are provided, returns the assistant content from the first turn that produces a non-tool-call response.'
end
.run(params) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 24

def self.run(params)
  api_token = params[:api_token]
  prompt = params[:prompt]
  prompt = PREDEFINED_PROMPTS[prompt] if PREDEFINED_PROMPTS.key?(prompt)
  question = params[:question]
  model = params[:model] || DEFAULT_MODEL
  tools = params[:tools]
  # Tool names from the OpenAI API are always JSON strings. Normalize handler keys so
  # callers can register handlers with either string or symbol keys without surprises.
  tool_handlers = (params[:tool_handlers] || {}).transform_keys(&:to_s)
  max_tool_iterations = params[:max_tool_iterations] || DEFAULT_MAX_TOOL_ITERATIONS

  headers = {
    'Content-Type': 'application/json',
    Authorization: "Bearer #{api_token}"
  }

  # Backwards-compatible single-shot path when no tools are provided.
  if tools.nil? || tools.empty?
    body = request_body(prompt: prompt, question: question, model: model)
    response = Net::HTTP.post(OPENAI_API_ENDPOINT, body, headers)
    return parse_text_response(response)
  end

  run_with_tools(
    prompt: prompt,
    question: question,
    model: model,
    tools: tools,
    tool_handlers: tool_handlers,
    max_tool_iterations: max_tool_iterations,
    headers: headers
  )
end
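The handler-key normalization deserves a quick illustration: the OpenAI API returns tool names as JSON strings, so symbol-keyed handler hashes must be normalized before lookup. A minimal sketch with illustrative handler names:

```ruby
# Callers may register handlers under Symbol or String keys.
tool_handlers = { validate_length: ->(args) { args }, 'noop' => ->(_args) { nil } }

# The same normalization `run` performs before the tool-use loop.
normalized = tool_handlers.transform_keys(&:to_s)
```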
.run_with_tools(prompt:, question:, model:, tools:, tool_handlers:, max_tool_iterations:, headers:) ⇒ Object
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 59

def self.run_with_tools(prompt:, question:, model:, tools:, tool_handlers:, max_tool_iterations:, headers:)
  messages = [
    format_message(role: 'system', text: prompt),
    format_message(role: 'user', text: question),
  ].compact

  max_tool_iterations.times do
    body = request_body_with_messages(messages: messages, tools: tools, model: model)
    response = Net::HTTP.post(OPENAI_API_ENDPOINT, body, headers)
    message = parse_assistant_message(response)
    tool_calls = message['tool_calls']

    # No tool calls — model produced a final answer.
    return message['content'] if tool_calls.nil? || tool_calls.empty?

    # Append the assistant's tool-call message verbatim, then run each handler
    # and append the corresponding `role: tool` results.
    messages << message
    tool_calls.each do |tool_call|
      messages << execute_tool_call(tool_call, tool_handlers)
    end
  end

  UI.user_error!(
    "OpenAI tool-use loop did not terminate after #{max_tool_iterations} iterations. " \
    'Increase `max_tool_iterations` or check that your prompt instructs the model to stop calling tools.'
  )
end
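The loop's shape can be traced offline with a scripted fake "model" in place of the HTTP round-trip: turn one returns a tool call, turn two returns plain text. All names here are illustrative stand-ins, not the action's internals:

```ruby
require 'json'

# Scripted assistant messages: first a tool call, then a final text answer.
fake_turns = [
  { 'tool_calls' => [{ 'id' => 'call_1',
                       'function' => { 'name' => 'validate_length',
                                       'arguments' => '{"text":"Short notes."}' } }] },
  { 'content' => 'Short notes.' }
]
handlers = { 'validate_length' => ->(args) { { ok: args['text'].length <= 350 } } }

messages = [{ role: 'user', content: 'Write release notes' }]
answer = nil
fake_turns.each do |message|
  tool_calls = message['tool_calls']
  if tool_calls.nil? || tool_calls.empty?
    # No tool calls: the model produced a final answer, so the loop ends.
    answer = message['content']
    break
  end

  # Append the tool-call message, then a `role: tool` result per call.
  messages << message
  tool_calls.each do |tc|
    args = JSON.parse(tc.dig('function', 'arguments'))
    result = handlers[tc.dig('function', 'name')].call(args)
    messages << { role: 'tool', tool_call_id: tc['id'], content: JSON.generate(result) }
  end
end
```

After the first turn the conversation holds three messages (user, assistant tool call, tool result); the second turn terminates the loop with the text answer.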
.serialize_tool_result(name:, result:) ⇒ Object
Serializes a tool result to a JSON string. Handlers are contracted to return JSON-serializable values, but a buggy handler might return something like a `Pathname`, `Proc`, or a custom object whose `to_json` raises. Failing the whole conversation over a serialization error is harsh — instead, log locally and send a structured `{ error: … }` back so the model can recover.
The returned object's class name is exposed (handler authorship is local, not secret) but the exception's message is NOT forwarded — same reasoning as `invoke_tool_handler`: handler-returned objects can carry secrets.
# File 'lib/fastlane/plugin/wpmreleasetoolkit/actions/common/openai_ask_action.rb', line 175

def self.serialize_tool_result(name:, result:)
  JSON.generate(result)
rescue StandardError => e
  UI.error("Could not serialize tool result for '#{name}': #{e.class}: #{e.message}. Result class: #{result.class}")
  JSON.generate({ error: "Tool result for '#{name}' could not be serialized to JSON. Returned class: #{result.class}." })
end
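The fallback can be demonstrated with a hypothetical stand-in for a buggy handler return value, an object whose `#to_json` raises (`Unserializable` and `serialize_result` are illustrative names, and the `UI.error` logging is omitted for self-containment):

```ruby
require 'json'

# Hypothetical buggy handler return value: JSON.generate invokes #to_json on
# objects it does not natively handle, so this raises mid-serialization.
class Unserializable
  def to_json(*_args)
    raise TypeError, 'cannot serialize'
  end
end

# Mirrors serialize_tool_result: serialize, or fall back to a structured
# error naming only the returned class.
def serialize_result(name, result)
  JSON.generate(result)
rescue StandardError
  JSON.generate({ error: "Tool result for '#{name}' could not be serialized to JSON. Returned class: #{result.class}." })
end
```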