Class: OpenAI::Models::Chat::ChatCompletionChunk::Choice
- Inherits: Internal::Type::BaseModel
  - Object
  - Internal::Type::BaseModel
  - OpenAI::Models::Chat::ChatCompletionChunk::Choice
- Defined in: lib/openai/models/chat/chat_completion_chunk.rb
Defined Under Namespace
Modules: FinishReason
Classes: Delta, Logprobs
Instance Attribute Summary
- #delta ⇒ OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta
  A chat completion delta generated by streamed model responses.
- #finish_reason ⇒ Symbol, ...
  The reason the model stopped generating tokens.
- #index ⇒ Integer
  The index of the choice in the list of choices.
- #logprobs ⇒ OpenAI::Models::Chat::ChatCompletionChunk::Choice::Logprobs?
  Log probability information for the choice.
Class Method Summary
Instance Method Summary
- #initialize(delta:, finish_reason:, index:, logprobs: nil) ⇒ Object (constructor)
  Some parameter documentation has been truncated; see Choice for more details.
Methods inherited from Internal::Type::BaseModel
==, #==, #[], coerce, #deconstruct_keys, #deep_to_h, dump, fields, hash, #hash, inherited, inspect, #inspect, known_fields, optional, recursively_to_h, required, #to_h, #to_json, #to_s, to_sorbet_type, #to_yaml
Methods included from Internal::Type::Converter
#coerce, coerce, #dump, dump, #inspect, inspect, type_info
Methods included from Internal::Util::SorbetRuntimeSupport
#const_missing, #define_sorbet_constant!, #sorbet_constant_defined?, #to_sorbet_type, to_sorbet_type
Constructor Details
#initialize(delta:, finish_reason:, index:, logprobs: nil) ⇒ Object
Some parameter documentation has been truncated; see OpenAI::Models::Chat::ChatCompletionChunk::Choice for more details.
# File 'lib/openai/models/chat/chat_completion_chunk.rb', line 106

class Choice < OpenAI::Internal::Type::BaseModel
  # @!attribute delta
  #   A chat completion delta generated by streamed model responses.
  #
  #   @return [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta]
  required :delta, -> { OpenAI::Chat::ChatCompletionChunk::Choice::Delta }

  # @!attribute finish_reason
  #   The reason the model stopped generating tokens. This will be `stop` if the model
  #   hit a natural stop point or a provided stop sequence, `length` if the maximum
  #   number of tokens specified in the request was reached, `content_filter` if
  #   content was omitted due to a flag from our content filters, `tool_calls` if the
  #   model called a tool, or `function_call` (deprecated) if the model called a
  #   function.
  #
  #   @return [Symbol, OpenAI::Models::Chat::ChatCompletionChunk::Choice::FinishReason, nil]
  required :finish_reason,
           enum: -> { OpenAI::Chat::ChatCompletionChunk::Choice::FinishReason },
           nil?: true

  # @!attribute index
  #   The index of the choice in the list of choices.
  #
  #   @return [Integer]
  required :index, Integer

  # @!attribute logprobs
  #   Log probability information for the choice.
  #
  #   @return [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Logprobs, nil]
  optional :logprobs, -> { OpenAI::Chat::ChatCompletionChunk::Choice::Logprobs }, nil?: true

  # @!method initialize(delta:, finish_reason:, index:, logprobs: nil)
  #   Some parameter documentation has been truncated, see
  #   {OpenAI::Models::Chat::ChatCompletionChunk::Choice} for more details.
  #
  #   @param delta [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta] A chat completion delta generated by streamed model responses.
  #
  #   @param finish_reason [Symbol, OpenAI::Models::Chat::ChatCompletionChunk::Choice::FinishReason, nil] The reason the model stopped generating tokens. This will be `stop` if the model
  #
  #   @param index [Integer] The index of the choice in the list of choices.
  #
  #   @param logprobs [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Logprobs, nil] Log probability information for the choice.

  # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice#delta
  class Delta < OpenAI::Internal::Type::BaseModel
    # @!attribute content
    #   The contents of the chunk message.
    #
    #   @return [String, nil]
    optional :content, String, nil?: true

    # @!attribute function_call
    #   @deprecated
    #
    #   Deprecated and replaced by `tool_calls`. The name and arguments of a function
    #   that should be called, as generated by the model.
    #
    #   @return [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::FunctionCall, nil]
    optional :function_call, -> { OpenAI::Chat::ChatCompletionChunk::Choice::Delta::FunctionCall }

    # @!attribute refusal
    #   The refusal message generated by the model.
    #
    #   @return [String, nil]
    optional :refusal, String, nil?: true

    # @!attribute role
    #   The role of the author of this message.
    #
    #   @return [Symbol, OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::Role, nil]
    optional :role, enum: -> { OpenAI::Chat::ChatCompletionChunk::Choice::Delta::Role }

    # @!attribute tool_calls
    #
    #   @return [Array<OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall>, nil]
    optional :tool_calls,
             -> { OpenAI::Internal::Type::ArrayOf[OpenAI::Chat::ChatCompletionChunk::Choice::Delta::ToolCall] }

    # @!method initialize(content: nil, function_call: nil, refusal: nil, role: nil, tool_calls: nil)
    #   Some parameter documentation has been truncated, see
    #   {OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta} for more details.
    #
    #   A chat completion delta generated by streamed model responses.
    #
    #   @param content [String, nil] The contents of the chunk message.
    #
    #   @param function_call [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::FunctionCall] Deprecated and replaced by `tool_calls`. The name and arguments of a function th
    #
    #   @param refusal [String, nil] The refusal message generated by the model.
    #
    #   @param role [Symbol, OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::Role] The role of the author of this message.
    #
    #   @param tool_calls [Array<OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall>]

    # @deprecated
    #
    # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta#function_call
    class FunctionCall < OpenAI::Internal::Type::BaseModel
      # @!attribute arguments
      #   The arguments to call the function with, as generated by the model in JSON
      #   format. Note that the model does not always generate valid JSON, and may
      #   hallucinate parameters not defined by your function schema. Validate the
      #   arguments in your code before calling your function.
      #
      #   @return [String, nil]
      optional :arguments, String

      # @!attribute name
      #   The name of the function to call.
      #
      #   @return [String, nil]
      optional :name, String

      # @!method initialize(arguments: nil, name: nil)
      #   Some parameter documentation has been truncated, see
      #   {OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::FunctionCall} for
      #   more details.
      #
      #   Deprecated and replaced by `tool_calls`. The name and arguments of a function
      #   that should be called, as generated by the model.
      #
      #   @param arguments [String] The arguments to call the function with, as generated by the model in JSON forma
      #
      #   @param name [String] The name of the function to call.
    end

    # The role of the author of this message.
    #
    # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta#role
    module Role
      extend OpenAI::Internal::Type::Enum

      DEVELOPER = :developer
      SYSTEM = :system
      USER = :user
      ASSISTANT = :assistant
      TOOL = :tool

      # @!method self.values
      #   @return [Array<Symbol>]
    end

    class ToolCall < OpenAI::Internal::Type::BaseModel
      # @!attribute index
      #
      #   @return [Integer]
      required :index, Integer

      # @!attribute id
      #   The ID of the tool call.
      #
      #   @return [String, nil]
      optional :id, String

      # @!attribute function
      #
      #   @return [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Function, nil]
      optional :function, -> { OpenAI::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Function }

      # @!attribute type
      #   The type of the tool. Currently, only `function` is supported.
      #
      #   @return [Symbol, OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Type, nil]
      optional :type, enum: -> { OpenAI::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Type }

      # @!method initialize(index:, id: nil, function: nil, type: nil)
      #   @param index [Integer]
      #
      #   @param id [String] The ID of the tool call.
      #
      #   @param function [OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Function]
      #
      #   @param type [Symbol, OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Type] The type of the tool. Currently, only `function` is supported.

      # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall#function
      class Function < OpenAI::Internal::Type::BaseModel
        # @!attribute arguments
        #   The arguments to call the function with, as generated by the model in JSON
        #   format. Note that the model does not always generate valid JSON, and may
        #   hallucinate parameters not defined by your function schema. Validate the
        #   arguments in your code before calling your function.
        #
        #   @return [String, nil]
        optional :arguments, String

        # @!attribute name
        #   The name of the function to call.
        #
        #   @return [String, nil]
        optional :name, String

        # @!method initialize(arguments: nil, name: nil)
        #   Some parameter documentation has been truncated, see
        #   {OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall::Function}
        #   for more details.
        #
        #   @param arguments [String] The arguments to call the function with, as generated by the model in JSON forma
        #
        #   @param name [String] The name of the function to call.
      end

      # The type of the tool. Currently, only `function` is supported.
      #
      # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta::ToolCall#type
      module Type
        extend OpenAI::Internal::Type::Enum

        FUNCTION = :function

        # @!method self.values
        #   @return [Array<Symbol>]
      end
    end
  end

  # The reason the model stopped generating tokens. This will be `stop` if the model
  # hit a natural stop point or a provided stop sequence, `length` if the maximum
  # number of tokens specified in the request was reached, `content_filter` if
  # content was omitted due to a flag from our content filters, `tool_calls` if the
  # model called a tool, or `function_call` (deprecated) if the model called a
  # function.
  #
  # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice#finish_reason
  module FinishReason
    extend OpenAI::Internal::Type::Enum

    STOP = :stop
    LENGTH = :length
    TOOL_CALLS = :tool_calls
    CONTENT_FILTER = :content_filter
    FUNCTION_CALL = :function_call

    # @!method self.values
    #   @return [Array<Symbol>]
  end

  # @see OpenAI::Models::Chat::ChatCompletionChunk::Choice#logprobs
  class Logprobs < OpenAI::Internal::Type::BaseModel
    # @!attribute content
    #   A list of message content tokens with log probability information.
    #
    #   @return [Array<OpenAI::Models::Chat::ChatCompletionTokenLogprob>, nil]
    required :content,
             -> { OpenAI::Internal::Type::ArrayOf[OpenAI::Chat::ChatCompletionTokenLogprob] },
             nil?: true

    # @!attribute refusal
    #   A list of message refusal tokens with log probability information.
    #
    #   @return [Array<OpenAI::Models::Chat::ChatCompletionTokenLogprob>, nil]
    required :refusal,
             -> { OpenAI::Internal::Type::ArrayOf[OpenAI::Chat::ChatCompletionTokenLogprob] },
             nil?: true

    # @!method initialize(content:, refusal:)
    #   Log probability information for the choice.
    #
    #   @param content [Array<OpenAI::Models::Chat::ChatCompletionTokenLogprob>, nil] A list of message content tokens with log probability information.
    #
    #   @param refusal [Array<OpenAI::Models::Chat::ChatCompletionTokenLogprob>, nil] A list of message refusal tokens with log probability information.
  end
end
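Because each `Delta#tool_calls` entry carries only a fragment of a tool call, stream consumers typically merge fragments by `index`: the `id` and function `name` usually arrive in the first fragment, while `arguments` accumulates as JSON text. The sketch below illustrates that merging pattern using plain hashes as stand-ins for `ToolCall` deltas; it is not part of this class's API.

```ruby
# Merge fragmented tool-call deltas into complete calls, keyed by `index`.
def merge_tool_call_deltas(deltas)
  calls = {}
  deltas.each do |d|
    call = (calls[d[:index]] ||= { id: nil, name: nil, arguments: +"" })
    call[:id]   ||= d[:id]                       # id arrives once
    call[:name] ||= d.dig(:function, :name)      # name arrives once
    call[:arguments] << (d.dig(:function, :arguments) || "")  # JSON accumulates
  end
  calls.values
end

# Hypothetical fragments as they might arrive across streamed chunks.
deltas = [
  { index: 0, id: "call_abc", function: { name: "get_weather", arguments: "" } },
  { index: 0, function: { arguments: "{\"city\":" } },
  { index: 0, function: { arguments: "\"Paris\"}" } }
]

merged = merge_tool_call_deltas(deltas)
# merged.first[:arguments] => "{\"city\":\"Paris\"}"
```

Only once the choice's `finish_reason` arrives as `tool_calls` is the accumulated `arguments` string expected to be complete, parseable JSON.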
Instance Attribute Details
#delta ⇒ OpenAI::Models::Chat::ChatCompletionChunk::Choice::Delta
A chat completion delta generated by streamed model responses.
# File 'lib/openai/models/chat/chat_completion_chunk.rb', line 111

required :delta, -> { OpenAI::Chat::ChatCompletionChunk::Choice::Delta }
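A common consumption pattern for `#delta` is concatenating `content` fragments across chunks. The sketch below uses `Struct` stand-ins for `Choice` and `Delta` rather than the real model classes, to show the shape of the loop without a live stream.

```ruby
# Stand-ins mirroring the shape of Choice and Delta; a real stream yields
# OpenAI::Models::Chat::ChatCompletionChunk objects from the client instead.
Delta  = Struct.new(:content, keyword_init: true)
Choice = Struct.new(:index, :delta, :finish_reason, keyword_init: true)

# Join the content fragments carried by each choice's delta, skipping
# chunks whose delta has no content (e.g. the initial role-only delta).
def accumulate_content(choices)
  choices.map { |c| c.delta.content }.compact.join
end

chunks = [
  Choice.new(index: 0, delta: Delta.new(content: nil)),
  Choice.new(index: 0, delta: Delta.new(content: "Hello")),
  Choice.new(index: 0, delta: Delta.new(content: ", world")),
  Choice.new(index: 0, delta: Delta.new(content: nil), finish_reason: :stop)
]

puts accumulate_content(chunks)  # => "Hello, world"
```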
#finish_reason ⇒ Symbol, ...
The reason the model stopped generating tokens. This will be `stop` if the model hit a natural stop point or a provided stop sequence, `length` if the maximum number of tokens specified in the request was reached, `content_filter` if content was omitted due to a flag from our content filters, `tool_calls` if the model called a tool, or `function_call` (deprecated) if the model called a function.
# File 'lib/openai/models/chat/chat_completion_chunk.rb', line 122

required :finish_reason,
         enum: -> { OpenAI::Chat::ChatCompletionChunk::Choice::FinishReason },
         nil?: true
#index ⇒ Integer
The index of the choice in the list of choices.
# File 'lib/openai/models/chat/chat_completion_chunk.rb', line 132

required :index, Integer
#logprobs ⇒ OpenAI::Models::Chat::ChatCompletionChunk::Choice::Logprobs?
Log probability information for the choice.
# File 'lib/openai/models/chat/chat_completion_chunk.rb', line 138

optional :logprobs, -> { OpenAI::Chat::ChatCompletionChunk::Choice::Logprobs }, nil?: true
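Each token entry in `Logprobs#content` carries a natural-log probability, so `Math.exp` recovers the linear probability and summing log probabilities gives the joint log-probability of the sequence. A sketch using plain hashes as stand-ins for `ChatCompletionTokenLogprob` objects (the token strings and values are made up):

```ruby
# Hypothetical per-token logprob entries, as hashes rather than model objects.
tokens = [
  { token: "Hello", logprob: -0.105 },
  { token: "!",     logprob: -0.223 }
]

# exp(logprob) converts each natural-log probability back to [0, 1].
probabilities = tokens.map { |t| Math.exp(t[:logprob]).round(3) }
# Summing logprobs multiplies probabilities: the joint sequence probability.
joint_logprob = tokens.sum { |t| t[:logprob] }

p probabilities                        # => [0.9, 0.8]
p Math.exp(joint_logprob).round(3)     # => 0.72
```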
Class Method Details
.values ⇒ Array<Symbol>
# File 'lib/openai/models/chat/chat_completion_chunk.rb', line 320