Class: OpenAI::Resources::Chat::Completions
- Inherits: Object
- Defined in:
  - lib/openai/resources/chat/completions.rb
  - lib/openai/resources/chat/completions/messages.rb
Overview
Given a list of messages comprising a conversation, the model will return a response.
Defined Under Namespace
Classes: Messages
Instance Attribute Summary
-
#messages ⇒ OpenAI::Resources::Chat::Completions::Messages
readonly
Given a list of messages comprising a conversation, the model will return a response.
Instance Method Summary
- #build_tools_with_models(tools, tool_models) ⇒ Object
-
#create(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, prompt_cache_retention: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, verbosity: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
See #stream_raw for the streaming counterpart.
-
#delete(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletionDeleted
Delete a stored chat completion.
- #get_structured_output_models(parsed) ⇒ Object
-
#initialize(client:) ⇒ Completions
constructor
private
A new instance of Completions.
-
#list(after: nil, limit: nil, metadata: nil, model: nil, order: nil, request_options: {}) ⇒ OpenAI::Internal::CursorPage<OpenAI::Models::Chat::ChatCompletion>
Some parameter documentation has been truncated; see Models::Chat::CompletionListParams for more details.
-
#retrieve(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Get a stored chat completion.
- #stream(params) ⇒ Object
-
#stream_raw(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, prompt_cache_retention: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, verbosity: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Internal::Stream<OpenAI::Models::Chat::ChatCompletionChunk>
See #create for the non-streaming counterpart.
-
#update(completion_id, metadata:, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Some parameter documentation has been truncated; see Models::Chat::CompletionUpdateParams for more details.
Constructor Details
#initialize(client:) ⇒ Completions
This method is part of a private API. You should avoid using this method if possible, as it may be removed or changed in the future.
Returns a new instance of Completions.
# File 'lib/openai/resources/chat/completions.rb', line 493

def initialize(client:)
  @client = client
  @messages = OpenAI::Resources::Chat::Completions::Messages.new(client: client)
end
Instance Attribute Details
#messages ⇒ OpenAI::Resources::Chat::Completions::Messages (readonly)
Given a list of messages comprising a conversation, the model will return a response.
# File 'lib/openai/resources/chat/completions.rb', line 12

def messages
  @messages
end
Instance Method Details
#build_tools_with_models(tools, tool_models) ⇒ Object
# File 'lib/openai/resources/chat/completions.rb', line 225

def build_tools_with_models(tools, tool_models)
  return [] if tools.nil?

  tools.map do |tool|
    next tool unless tool[:type] == :function

    function_name = tool.dig(:function, :name)
    model = tool_models[function_name]
    model ? tool.merge(model: model) : tool
  end
end
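The helper pairs each function tool with the converter model registered under its name, so parsed tool-call arguments can later be coerced. The plain-Ruby sketch below mirrors that merge logic, using a stand-in symbol (`:WeatherModel`) where the SDK would hold a JsonSchemaConverter; the tool names are illustrative.

```ruby
# Stand-in tool definitions; :custom_tool represents any non-function tool.
tools = [
  {type: :function, function: {name: "get_weather", parameters: {}}},
  {type: :custom_tool}
]
# Map from function name to its (stand-in) structured-output model.
tool_models = {"get_weather" => :WeatherModel}

merged = tools.map do |tool|
  next tool unless tool[:type] == :function

  model = tool_models[tool.dig(:function, :name)]
  model ? tool.merge(model: model) : tool
end

merged.first[:model]     # => :WeatherModel
merged.last.key?(:model) # => false (non-function tools pass through unchanged)
```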
#create(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, prompt_cache_retention: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, verbosity: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
See #stream_raw for the streaming counterpart.
Some parameter documentation has been truncated; see Models::Chat::CompletionCreateParams for more details.
**Starting a new project?** We recommend trying [Responses](platform.openai.com/docs/api-reference/responses) to take advantage of the latest OpenAI platform features. Compare [Chat Completions with Responses](platform.openai.com/docs/guides/responses-vs-chat-completions?api-mode=responses).
Creates a model response for the given chat conversation. Learn more in the [text generation](platform.openai.com/docs/guides/text-generation), [vision](platform.openai.com/docs/guides/vision), and [audio](platform.openai.com/docs/guides/audio) guides.
Parameter support can differ depending on the model used to generate the response, particularly for newer reasoning models. Parameters that are only supported for reasoning models are noted below. For the current state of unsupported parameters in reasoning models, [refer to the reasoning guide](platform.openai.com/docs/guides/reasoning).
Returns a chat completion object, or a streamed sequence of chat completion chunk objects if the request is streamed.
# File 'lib/openai/resources/chat/completions.rb', line 115

def create(params)
  parsed, options = OpenAI::Chat::CompletionCreateParams.dump_request(params)
  if parsed[:stream]
    message = "Please use `#stream_raw` for the streaming use case."
    raise ArgumentError.new(message)
  end

  model, tool_models = get_structured_output_models(parsed)

  # rubocop:disable Metrics/BlockLength
  unwrap = ->(raw) do
    if model.is_a?(OpenAI::StructuredOutput::JsonSchemaConverter)
      raw[:choices]&.each do |choice|
        message = choice.fetch(:message)
        begin
          content = message.fetch(:content)
          parsed = content.nil? ? nil : JSON.parse(content, symbolize_names: true)
        rescue JSON::ParserError => e
          parsed = e
        end
        coerced = OpenAI::Internal::Type::Converter.coerce(model, parsed)
        message.store(:parsed, coerced)
      end
    end
    raw[:choices]&.each do |choice|
      choice.dig(:message, :tool_calls)&.each do |tool_call|
        func = tool_call.fetch(:function)
        next if (model = tool_models[func.fetch(:name)]).nil?

        begin
          arguments = func.fetch(:arguments)
          parsed = arguments.nil? ? nil : JSON.parse(arguments, symbolize_names: true)
        rescue JSON::ParserError => e
          parsed = e
        end
        coerced = OpenAI::Internal::Type::Converter.coerce(model, parsed)
        func.store(:parsed, coerced)
      end
    end
    raw
  end
  # rubocop:enable Metrics/BlockLength

  @client.request(
    method: :post,
    path: "chat/completions",
    body: parsed,
    unwrap: unwrap,
    model: OpenAI::Chat::ChatCompletion,
    options: options
  )
end
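A minimal request body for #create can be sketched as a reusable hash; the model name and messages below are illustrative, and the commented call assumes a configured OpenAI::Client with a valid API key.

```ruby
# Keyword arguments for Completions#create, collected as a hash.
params = {
  model: "gpt-4o",
  messages: [
    {role: "system", content: "You are a terse assistant."},
    {role: "user", content: "Name one prime number."}
  ],
  max_completion_tokens: 16
}

# With a configured client (requires an API key):
#   completion = client.chat.completions.create(**params)
#   puts completion.choices.first.message.content

params[:messages].length # => 2
```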
#delete(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletionDeleted
Delete a stored chat completion. Only Chat Completions that have been created with the `store` parameter set to `true` can be deleted.
# File 'lib/openai/resources/chat/completions.rb', line 481

def delete(completion_id, params = {})
  @client.request(
    method: :delete,
    path: ["chat/completions/%1$s", completion_id],
    model: OpenAI::Chat::ChatCompletionDeleted,
    options: params[:request_options]
  )
end
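The path is given as a format-string/argument pair, which the client interpolates before dispatch. The expansion is equivalent to Kernel#format with a positional reference, illustrated below with a made-up completion id.

```ruby
# ["chat/completions/%1$s", completion_id] expands like so
# ("chatcmpl-abc123" is a made-up id):
path = format("chat/completions/%1$s", "chatcmpl-abc123")
path # => "chat/completions/chatcmpl-abc123"
```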
#get_structured_output_models(parsed) ⇒ Object
# File 'lib/openai/resources/chat/completions.rb', line 169

def get_structured_output_models(parsed)
  model = nil
  tool_models = {}

  case parsed
  in {response_format: OpenAI::StructuredOutput::JsonSchemaConverter => model}
    parsed.update(
      response_format: {
        type: :json_schema,
        json_schema: {
          strict: true,
          name: model.name.split("::").last,
          schema: model.to_json_schema
        }
      }
    )
  in {response_format: {type: :json_schema, json_schema: OpenAI::StructuredOutput::JsonSchemaConverter => model}}
    parsed.fetch(:response_format).update(
      json_schema: {
        strict: true,
        name: model.name.split("::").last,
        schema: model.to_json_schema
      }
    )
  in {response_format: {type: :json_schema, json_schema: {schema: OpenAI::StructuredOutput::JsonSchemaConverter => model}}}
    parsed.dig(:response_format, :json_schema).store(:schema, model.to_json_schema)
  in {tools: Array => tools}
    mapped = tools.map do |tool|
      case tool
      in OpenAI::StructuredOutput::JsonSchemaConverter
        name = tool.name.split("::").last
        tool_models.store(name, tool)
        {
          type: :function,
          function: {
            strict: true,
            name: name,
            parameters: tool.to_json_schema
          }
        }
      in {function: {parameters: OpenAI::StructuredOutput::JsonSchemaConverter => params}}
        func = tool.fetch(:function)
        name = func[:name] ||= params.name.split("::").last
        tool_models.store(name, params)
        func.update(parameters: params.to_json_schema)
        tool
      else
        tool
      end
    end
    tools.replace(mapped)
  else
  end

  [model, tool_models]
end
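The first pattern above rewrites a bare JsonSchemaConverter into the wire-level `response_format` payload. A hand-built equivalent for a hypothetical Person model (its schema invented here for illustration) looks like:

```ruby
# What model.to_json_schema might produce for a hypothetical Person model.
person_schema = {
  type: "object",
  properties: {name: {type: "string"}, age: {type: "integer"}},
  required: ["name", "age"],
  additionalProperties: false
}

# The wire-level payload the method constructs.
response_format = {
  type: :json_schema,
  json_schema: {
    strict: true,
    name: "Person",        # model.name.split("::").last
    schema: person_schema  # model.to_json_schema
  }
}
```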
#list(after: nil, limit: nil, metadata: nil, model: nil, order: nil, request_options: {}) ⇒ OpenAI::Internal::CursorPage<OpenAI::Models::Chat::ChatCompletion>
Some parameter documentation has been truncated; see Models::Chat::CompletionListParams for more details.
List stored Chat Completions. Only Chat Completions that have been stored with the `store` parameter set to `true` will be returned.
# File 'lib/openai/resources/chat/completions.rb', line 456

def list(params = {})
  parsed, options = OpenAI::Chat::CompletionListParams.dump_request(params)
  query = OpenAI::Internal::Util.encode_query_params(parsed)
  @client.request(
    method: :get,
    path: "chat/completions",
    query: query,
    page: OpenAI::Internal::CursorPage,
    model: OpenAI::Chat::ChatCompletion,
    options: options
  )
end
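List requests are cursor-paginated. The sketch below shows the kind of query such a request encodes, using the stdlib encoder for illustration (the SDK uses its own `encode_query_params`); the cursor usage in the comment assumes a configured client.

```ruby
require "uri"

# Ask for the 20 most recent stored completions.
query = {limit: 20, order: "desc"}
URI.encode_www_form(query) # => "limit=20&order=desc"

# Fetching the next page passes the last seen id as the `after` cursor:
#   client.chat.completions.list(after: previous_page_last_id, limit: 20)
```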
#retrieve(completion_id, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Get a stored chat completion. Only Chat Completions that have been created with the `store` parameter set to `true` will be returned.
# File 'lib/openai/resources/chat/completions.rb', line 395

def retrieve(completion_id, params = {})
  @client.request(
    method: :get,
    path: ["chat/completions/%1$s", completion_id],
    model: OpenAI::Chat::ChatCompletion,
    options: params[:request_options]
  )
end
#stream(params) ⇒ Object
# File 'lib/openai/resources/chat/completions.rb', line 238

def stream(params)
  parsed, options = OpenAI::Chat::CompletionCreateParams.dump_request(params)
  parsed.store(:stream, true)

  response_format, tool_models = get_structured_output_models(parsed)
  input_tools = build_tools_with_models(parsed[:tools], tool_models)

  raw_stream = @client.request(
    method: :post,
    path: "chat/completions",
    headers: {"accept" => "text/event-stream"},
    body: parsed,
    stream: OpenAI::Internal::Stream,
    model: OpenAI::Chat::ChatCompletionChunk,
    options: options
  )

  OpenAI::Helpers::Streaming::ChatCompletionStream.new(
    raw_stream: raw_stream,
    response_format: response_format,
    input_tools: input_tools
  )
end
#stream_raw(messages:, model:, audio: nil, frequency_penalty: nil, function_call: nil, functions: nil, logit_bias: nil, logprobs: nil, max_completion_tokens: nil, max_tokens: nil, metadata: nil, modalities: nil, n: nil, parallel_tool_calls: nil, prediction: nil, presence_penalty: nil, prompt_cache_key: nil, prompt_cache_retention: nil, reasoning_effort: nil, response_format: nil, safety_identifier: nil, seed: nil, service_tier: nil, stop: nil, store: nil, stream_options: nil, temperature: nil, tool_choice: nil, tools: nil, top_logprobs: nil, top_p: nil, user: nil, verbosity: nil, web_search_options: nil, request_options: {}) ⇒ OpenAI::Internal::Stream<OpenAI::Models::Chat::ChatCompletionChunk>
See #create for the non-streaming counterpart.
Some parameter documentation has been truncated; see Models::Chat::CompletionCreateParams for more details.
**Starting a new project?** We recommend trying [Responses](platform.openai.com/docs/api-reference/responses) to take advantage of the latest OpenAI platform features. Compare [Chat Completions with Responses](platform.openai.com/docs/guides/responses-vs-chat-completions?api-mode=responses).
Creates a model response for the given chat conversation. Learn more in the [text generation](platform.openai.com/docs/guides/text-generation), [vision](platform.openai.com/docs/guides/vision), and [audio](platform.openai.com/docs/guides/audio) guides.
Parameter support can differ depending on the model used to generate the response, particularly for newer reasoning models. Parameters that are only supported for reasoning models are noted below. For the current state of unsupported parameters in reasoning models, [refer to the reasoning guide](platform.openai.com/docs/guides/reasoning).
Returns a chat completion object, or a streamed sequence of chat completion chunk objects if the request is streamed.
# File 'lib/openai/resources/chat/completions.rb', line 365

def stream_raw(params)
  parsed, options = OpenAI::Chat::CompletionCreateParams.dump_request(params)
  unless parsed.fetch(:stream, true)
    message = "Please use `#create` for the non-streaming use case."
    raise ArgumentError.new(message)
  end
  parsed.store(:stream, true)

  @client.request(
    method: :post,
    path: "chat/completions",
    headers: {"accept" => "text/event-stream"},
    body: parsed,
    stream: OpenAI::Internal::Stream,
    model: OpenAI::Chat::ChatCompletionChunk,
    options: options
  )
end
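Each ChatCompletionChunk carries an incremental content delta; the full reply is the concatenation of the deltas. The plain-Ruby sketch below shows that accumulation with stand-in deltas; the commented loop shows the equivalent against a live client (assumed configured).

```ruby
# Stand-in content deltas such as a stream might yield:
deltas = ["Hel", "lo", ", wor", "ld!"]

text = +"" # mutable accumulator
deltas.each { |delta| text << delta }
text # => "Hello, world!"

# Against the live API the loop would look like:
#   client.chat.completions.stream_raw(model: "gpt-4o", messages: msgs).each do |chunk|
#     print chunk.choices.first&.delta&.content
#   end
```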
#update(completion_id, metadata:, request_options: {}) ⇒ OpenAI::Models::Chat::ChatCompletion
Some parameter documentation has been truncated; see Models::Chat::CompletionUpdateParams for more details.
Modify a stored chat completion. Only Chat Completions that have been created with the `store` parameter set to `true` can be modified. Currently, the only supported modification is to update the `metadata` field.
# File 'lib/openai/resources/chat/completions.rb', line 422

def update(completion_id, params)
  parsed, options = OpenAI::Chat::CompletionUpdateParams.dump_request(params)
  @client.request(
    method: :post,
    path: ["chat/completions/%1$s", completion_id],
    body: parsed,
    model: OpenAI::Chat::ChatCompletion,
    options: options
  )
end
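Since `metadata` is the only mutable field, an update body reduces to a flat string-to-string map. The keys and values below are illustrative, as is the commented call, which assumes a configured client and a made-up completion id.

```ruby
# An update body: metadata is a flat map of string keys to string values.
body = {metadata: {"conversation_id" => "conv_123", "reviewed" => "true"}}

# With a configured client:
#   client.chat.completions.update("chatcmpl-abc123", **body)

body[:metadata].values.all? { |v| v.is_a?(String) } # => true
```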