Class: OpenAI::Resources::Completions
- Inherits: Object
- Defined in: lib/openai/resources/completions.rb
Instance Method Summary
- #create(model:, prompt:, best_of: nil, echo: nil, frequency_penalty: nil, logit_bias: nil, logprobs: nil, max_tokens: nil, n: nil, presence_penalty: nil, seed: nil, stop: nil, stream_options: nil, suffix: nil, temperature: nil, top_p: nil, user: nil, request_options: {}) ⇒ OpenAI::Models::Completion
  See #create_streaming for streaming counterpart.
- #create_streaming(model:, prompt:, best_of: nil, echo: nil, frequency_penalty: nil, logit_bias: nil, logprobs: nil, max_tokens: nil, n: nil, presence_penalty: nil, seed: nil, stop: nil, stream_options: nil, suffix: nil, temperature: nil, top_p: nil, user: nil, request_options: {}) ⇒ OpenAI::Internal::Stream<OpenAI::Models::Completion>
  See #create for non-streaming counterpart.
- #initialize(client:) ⇒ Completions (constructor, private)
  A new instance of Completions.
Constructor Details
#initialize(client:) ⇒ Completions
This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.
Returns a new instance of Completions.
# File 'lib/openai/resources/completions.rb', line 138

def initialize(client:)
  @client = client
end
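Because this constructor is private, the resource is normally reached through an accessor on a configured client rather than built directly. A minimal sketch, assuming an API key in ENV["OPENAI_API_KEY"] and the standard client constructor:

require "openai"

# The client wires each resource up with `client:` internally;
# callers only use the accessor.
client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])
completions = client.completions  # => OpenAI::Resources::Completions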
Instance Method Details
#create(model:, prompt:, best_of: nil, echo: nil, frequency_penalty: nil, logit_bias: nil, logprobs: nil, max_tokens: nil, n: nil, presence_penalty: nil, seed: nil, stop: nil, stream_options: nil, suffix: nil, temperature: nil, top_p: nil, user: nil, request_options: {}) ⇒ OpenAI::Models::Completion
See #create_streaming for streaming counterpart.
Some parameter documentation has been truncated; see Models::CompletionCreateParams for more details.
Creates a completion for the provided prompt and parameters.
# File 'lib/openai/resources/completions.rb', line 54

def create(params)
  parsed, options = OpenAI::CompletionCreateParams.dump_request(params)
  if parsed[:stream]
    message = "Please use `#create_streaming` for the streaming use case."
    raise ArgumentError.new(message)
  end
  @client.request(
    method: :post,
    path: "completions",
    body: parsed,
    model: OpenAI::Completion,
    options: options
  )
end
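A usage sketch of the non-streaming call (the model name, prompt, and response fields are illustrative assumptions; the client is built as in the constructor example above):

# Request a single completion and print the generated text.
completion = client.completions.create(
  model: "gpt-3.5-turbo-instruct",  # assumed model name; substitute your own
  prompt: "Say this is a test",
  max_tokens: 16
)
puts completion.choices.first.text  # assumes choices carry a `text` field, as in the API response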
#create_streaming(model:, prompt:, best_of: nil, echo: nil, frequency_penalty: nil, logit_bias: nil, logprobs: nil, max_tokens: nil, n: nil, presence_penalty: nil, seed: nil, stop: nil, stream_options: nil, suffix: nil, temperature: nil, top_p: nil, user: nil, request_options: {}) ⇒ OpenAI::Internal::Stream<OpenAI::Models::Completion>
See #create for non-streaming counterpart.
Some parameter documentation has been truncated; see Models::CompletionCreateParams for more details.
Creates a completion for the provided prompt and parameters.
# File 'lib/openai/resources/completions.rb', line 117

def create_streaming(params)
  parsed, options = OpenAI::CompletionCreateParams.dump_request(params)
  unless parsed.fetch(:stream, true)
    message = "Please use `#create` for the non-streaming use case."
    raise ArgumentError.new(message)
  end
  parsed.store(:stream, true)
  @client.request(
    method: :post,
    path: "completions",
    headers: {"accept" => "text/event-stream"},
    body: parsed,
    stream: OpenAI::Internal::Stream,
    model: OpenAI::Completion,
    options: options
  )
end
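A streaming usage sketch, assuming the returned OpenAI::Internal::Stream is enumerable and yields OpenAI::Models::Completion chunks whose choices carry partial text:

# Iterate server-sent events as they arrive and print each text fragment.
stream = client.completions.create_streaming(
  model: "gpt-3.5-turbo-instruct",  # assumed model name; substitute your own
  prompt: "Write a haiku about Ruby"
)
stream.each do |chunk|
  print chunk.choices.first.text
end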