Class: RubyLLM::Providers::OpenAIResponses
- Inherits: OpenAI
  (Object → OpenAI → RubyLLM::Providers::OpenAIResponses)
- Defined in: lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb
Overview
OpenAI Responses API provider. Uses v1/responses endpoint instead of v1/chat/completions. Inherits from OpenAI and overrides only what differs.
Instance Attribute Summary collapse
-
#responses_config ⇒ Object
readonly
Returns the value of attribute responses_config.
-
#responses_session ⇒ Object
readonly
Returns the value of attribute responses_session.
Instance Method Summary collapse
-
#build_chunk(data) ⇒ Object
Override build_chunk for Responses API streaming events.
-
#complete(messages, tools:, temperature:, model:, params: {}, headers: {}, schema: nil, thinking: nil, &block) ⇒ Object
Override complete to handle response ID failures.
-
#completion_url ⇒ Object
Override endpoint URL.
-
#initialize(config, responses_session = nil, responses_config = {}) ⇒ OpenAIResponses
constructor
A new instance of OpenAIResponses.
-
#parse_completion_response(response) ⇒ Object
Override parse_completion_response for Responses API format.
-
#render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil, thinking: nil) ⇒ Object
Override render_payload for Responses API format.
-
#tool_for(tool) ⇒ Object
Override tool_for for flat format (not nested under 'function').
Constructor Details
#initialize(config, responses_session = nil, responses_config = {}) ⇒ OpenAIResponses
Returns a new instance of OpenAIResponses.
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 99

def initialize(config, responses_session = nil, responses_config = {})
  @responses_session = responses_session || ResponsesSession.new
  @responses_config = {
    stateful: false,
    store: true,
    truncation: :disabled,
    include: [],
  }.merge(responses_config)
  super(config)
end
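The constructor's default-handling reduces to a plain Hash#merge, where caller-supplied keys win over the defaults. A minimal sketch (the override values shown are hypothetical examples, not values taken from this codebase):

```ruby
# Defaults mirroring the constructor above
defaults = {
  stateful: false,
  store: true,
  truncation: :disabled,
  include: [],
}

# Hypothetical caller-supplied overrides
overrides = { stateful: true, truncation: :auto }

# Hash#merge favors the caller's values, as in the constructor
responses_config = defaults.merge(overrides)
```

Keys absent from the overrides (here `store` and `include`) keep their default values.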
Instance Attribute Details
#responses_config ⇒ Object (readonly)
Returns the value of attribute responses_config.
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 97

def responses_config
  @responses_config
end
#responses_session ⇒ Object (readonly)
Returns the value of attribute responses_session.
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 97

def responses_session
  @responses_session
end
Instance Method Details
#build_chunk(data) ⇒ Object
Override build_chunk for Responses API streaming events
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 182

def build_chunk(data)
  if responses_api_event?(data)
    build_responses_chunk(data)
  else
    super
  end
end
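The branch above hinges on recognizing Responses API stream events. A sketch of what such a predicate might look like (the body below is an assumption, not this library's implementation; Responses API streaming events carry type strings such as "response.output_text.delta"):

```ruby
# Assumed sketch: Responses API stream events are identified by a
# "type" field whose value begins with "response."
def responses_api_event?(data)
  data.is_a?(Hash) && data["type"].to_s.start_with?("response.")
end

responses_api_event?({ "type" => "response.output_text.delta" })  # Responses-style event
responses_api_event?({ "choices" => [] })                         # Chat Completions-style chunk
```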
#complete(messages, tools:, temperature:, model:, params: {}, headers: {}, schema: nil, thinking: nil, &block) ⇒ Object
Override complete to handle response ID failures
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 117

def complete(messages, tools:, temperature:, model:, params: {}, headers: {}, schema: nil, thinking: nil, &block)
  super
rescue BadRequestError => e
  raise unless response_id_not_found_error?(e)

  handle_response_id_failure
  retry
end
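The recovery flow above — re-raise unless the error is the recoverable kind, repair state, then retry the whole request — can be sketched with a toy error class (all names below are illustrative, not part of this library):

```ruby
# Toy stand-in for a BadRequestError caused by a stale stored response ID
class StaleResponseIdError < StandardError; end

attempts = 0
result = begin
  attempts += 1
  # Simulate the server rejecting a stored response ID on the first try only
  raise StaleResponseIdError, "response id not found" if attempts == 1

  :completed
rescue StaleResponseIdError
  # Analogous to handle_response_id_failure: clear the stale state, then
  # re-run the begin block from the top
  retry
end
```

Ruby's `retry` restarts the enclosing `begin` block, so the second attempt runs with the repaired state and succeeds.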
#completion_url ⇒ Object
Override endpoint URL
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 112

def completion_url
  "responses"
end
#parse_completion_response(response) ⇒ Object
Override parse_completion_response for Responses API format
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 143

def parse_completion_response(response)
  data = response.body
  return if data.nil? || !data.is_a?(Hash) || data.empty?

  case data["status"]
  when "completed"
    parse_completed_response(data, response)
  when "failed"
    raise ResponseFailedError.new(response, data.dig("error", "message") || "Response failed")
  when "in_progress", "queued"
    raise ResponseInProgressError.new(response, "Response still processing: #{data["id"]}")
  when "cancelled"
    raise ResponseCancelledError.new(response, "Response was cancelled: #{data["id"]}")
  when "incomplete"
    parse_incomplete_response(data, response)
  else
    raise Error.new(response, data.dig("error", "message")) if data.dig("error", "message")

    parse_completed_response(data, response)
  end
end
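The status dispatch above can be reduced to a small classifier over the response body. A simplified sketch (the real method raises typed errors and delegates to parser helpers rather than returning symbols):

```ruby
# Map a Responses API body to a coarse outcome, mirroring the case
# statement in parse_completion_response
def classify_response_status(data)
  case data["status"]
  when "completed" then :completed
  when "failed" then :failed
  when "in_progress", "queued" then :pending
  when "cancelled" then :cancelled
  when "incomplete" then :incomplete
  else
    # Unknown status: an embedded error message means failure;
    # otherwise assume the response is usable
    data.dig("error", "message") ? :failed : :completed
  end
end
```

Note the fallback branch: a body with no recognized status but an `error.message` is treated as a failure, while one with neither is parsed as completed.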
#render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil, thinking: nil) ⇒ Object
Override render_payload for Responses API format
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 127

def render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil, thinking: nil)
  system_msgs, other_msgs = () # helper name lost in the extracted source
  payload = build_base_payload(model, stream)
  add_instructions(payload, system_msgs)
  add_input(payload, other_msgs)
  add_temperature(payload, temperature)
  add_tools(payload, tools)
  add_schema(payload, schema)
  add_optional_parameters(payload)
  # helper call lost in the extracted source: (payload, stream)
  payload
end
#tool_for(tool) ⇒ Object
Override tool_for for flat format (not nested under 'function')
# File 'lib/swarm_sdk/ruby_llm_patches/responses_api_patch.rb', line 166

def tool_for(tool)
  parameters_schema = parameters_schema_for(tool)
  definition = {
    type: "function",
    name: tool.name,
    description: tool.description,
    parameters: parameters_schema,
  }

  return definition if tool.provider_params.empty?

  RubyLLM::Utils.deep_merge(definition, tool.provider_params)
end
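The "flat format" the override targets can be shown by contrasting the two tool shapes as plain hashes (tool name and schema below are illustrative):

```ruby
schema = { type: "object", properties: {}, required: [] }

# Chat Completions shape: the definition is nested under a "function" key
nested = {
  type: "function",
  function: { name: "lookup", description: "Example tool", parameters: schema },
}

# Responses API shape, as tool_for builds above: the same fields sit
# at the top level of the hash
flat = {
  type: "function",
  name: "lookup",
  description: "Example tool",
  parameters: schema,
}
```

This is why the base OpenAI provider's `tool_for` cannot be reused as-is: the Responses endpoint rejects the nested `function` wrapper.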