Class: Aws::BedrockAgent::Types::PromptModelInferenceConfiguration
- Inherits: Struct
  - Object
  - Struct
  - Aws::BedrockAgent::Types::PromptModelInferenceConfiguration
- Includes: Structure
- Defined in: lib/aws-sdk-bedrockagent/types.rb
Overview
Contains inference configurations related to model inference for a prompt. For more information, see [Inference parameters][1].

[1]: https://docs.aws.amazon.com/bedrock/latest/userguide/inference-parameters.html
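These fields are typically supplied under a prompt variant's inference configuration. The snippet below is a minimal sketch, not a definitive example: it assumes Aws::BedrockAgent::Client#create_prompt accepts a text inference configuration with these member names, and the prompt name, template text, and model ID are placeholders.

  require "aws-sdk-bedrockagent"

  # Minimal sketch; names, template text, and model ID are placeholders,
  # not values from this page.
  client = Aws::BedrockAgent::Client.new(region: "us-east-1")

  client.create_prompt(
    name: "example-prompt",
    variants: [
      {
        name: "variant-one",
        template_type: "TEXT",
        template_configuration: {
          text: { text: "Summarize the following text: {{input}}" }
        },
        model_id: "anthropic.claude-3-haiku-20240307-v1:0", # placeholder model ID
        inference_configuration: {
          text: {                     # corresponds to PromptModelInferenceConfiguration
            max_tokens: 512,
            stop_sequences: ["END"],
            temperature: 0.2,
            top_p: 0.9
            # :top_k is also a member of this type; whether a given model
            # accepts it varies.
          }
        }
      }
    ]
  )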
Constant Summary

- SENSITIVE = []
Instance Attribute Summary

- #max_tokens ⇒ Integer
  The maximum number of tokens to return in the response.
- #stop_sequences ⇒ Array<String>
  A list of strings that define sequences after which the model will stop generating.
- #temperature ⇒ Float
  Controls the randomness of the response.
- #top_k ⇒ Integer
  The number of most-likely candidates that the model considers for the next token during generation.
- #top_p ⇒ Float
  The percentage of most-likely candidates that the model considers for the next token.
Instance Attribute Details
#max_tokens ⇒ Integer
The maximum number of tokens to return in the response.
# File 'lib/aws-sdk-bedrockagent/types.rb', line 6007

class PromptModelInferenceConfiguration < Struct.new(
  :max_tokens, :stop_sequences, :temperature, :top_k, :top_p)
  SENSITIVE = []
  include Aws::Structure
end
#stop_sequences ⇒ Array<String>
A list of strings that define sequences after which the model will stop generating.
# File 'lib/aws-sdk-bedrockagent/types.rb', line 6007

class PromptModelInferenceConfiguration < Struct.new(
  :max_tokens, :stop_sequences, :temperature, :top_k, :top_p)
  SENSITIVE = []
  include Aws::Structure
end
#temperature ⇒ Float
Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
# File 'lib/aws-sdk-bedrockagent/types.rb', line 6007

class PromptModelInferenceConfiguration < Struct.new(
  :max_tokens, :stop_sequences, :temperature, :top_k, :top_p)
  SENSITIVE = []
  include Aws::Structure
end
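As a rough illustration (hypothetical values, not recommendations), a near-zero temperature favors repeatable output while a higher one favors variety:

  # Hypothetical configurations for the same variant; values are illustrative only.
  predictable = { max_tokens: 256, temperature: 0.0, top_p: 1.0 }
  varied      = { max_tokens: 256, temperature: 0.9, top_p: 0.95 }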
#top_k ⇒ Integer
The number of most-likely candidates that the model considers for the next token during generation.
# File 'lib/aws-sdk-bedrockagent/types.rb', line 6007

class PromptModelInferenceConfiguration < Struct.new(
  :max_tokens, :stop_sequences, :temperature, :top_k, :top_p)
  SENSITIVE = []
  include Aws::Structure
end
#top_p ⇒ Float
The percentage of most-likely candidates that the model considers for the next token.
# File 'lib/aws-sdk-bedrockagent/types.rb', line 6007

class PromptModelInferenceConfiguration < Struct.new(
  :max_tokens, :stop_sequences, :temperature, :top_k, :top_p)
  SENSITIVE = []
  include Aws::Structure
end
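When a prompt is retrieved, these attributes can be read back from the returned struct. A brief sketch, assuming Aws::BedrockAgent::Client#get_prompt and that the variant's inference configuration exposes this type under its text member; the prompt identifier is a placeholder.

  require "aws-sdk-bedrockagent"

  client = Aws::BedrockAgent::Client.new(region: "us-east-1")

  # "PROMPT_ID" is a placeholder identifier.
  resp = client.get_prompt(prompt_identifier: "PROMPT_ID")

  cfg = resp.variants.first.inference_configuration.text # PromptModelInferenceConfiguration
  puts cfg.max_tokens                     # Integer, or nil if unset
  puts cfg.temperature                    # Float, or nil if unset
  puts cfg.top_p                          # Float, or nil if unset
  puts (cfg.stop_sequences || []).inspect # Array<String>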