Class: Aws::BedrockAgent::Types::PromptModelInferenceConfiguration

Inherits:
Struct
  • Object
Includes:
Structure
Defined in:
lib/aws-sdk-bedrockagent/types.rb

Overview

Contains inference configurations related to model inference for a prompt. For more information, see [Inference parameters][1].

[1]: https://docs.aws.amazon.com/bedrock/latest/userguide/inference-parameters.html
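
As a hedged sketch of where this type is used: when creating a prompt in Prompt management with Aws::BedrockAgent::Client#create_prompt, these fields are passed under the text member of a variant's inference_configuration. The region, prompt name, model ID, and parameter values below are illustrative placeholders, not values taken from this page.

require "aws-sdk-bedrockagent"

# Placeholder region, names, and model ID; substitute your own values.
client = Aws::BedrockAgent::Client.new(region: "us-east-1")

resp = client.create_prompt(
  name: "example-summarizer-prompt",
  variants: [
    {
      name: "variantOne",
      template_type: "TEXT",
      template_configuration: {
        text: {
          text: "Summarize the following text: {{input}}",
          input_variables: [{ name: "input" }],
        },
      },
      model_id: "anthropic.claude-3-haiku-20240307-v1:0",
      # The keys below map to PromptModelInferenceConfiguration.
      inference_configuration: {
        text: {
          max_tokens: 512,
          stop_sequences: ["\n\nHuman:"],
          temperature: 0.5,
          top_p: 0.9,
          # top_k can be set here as well for models that support it.
        },
      },
    },
  ],
)

puts resp.id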

Constant Summary

SENSITIVE = []

Instance Attribute Summary

  • #max_tokens ⇒ Integer
  • #stop_sequences ⇒ Array<String>
  • #temperature ⇒ Float
  • #top_k ⇒ Integer
  • #top_p ⇒ Float

Instance Attribute Details

#max_tokens ⇒ Integer

The maximum number of tokens to return in the response.

Returns:

  • (Integer)


# File 'lib/aws-sdk-bedrockagent/types.rb', line 6067

class PromptModelInferenceConfiguration < Struct.new(
  :max_tokens,
  :stop_sequences,
  :temperature,
  :top_k,
  :top_p)
  SENSITIVE = []
  include Aws::Structure
end
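
Since the type is an Aws::Structure-backed Struct, it can also be constructed and inspected directly, for instance when building request payloads programmatically. A minimal sketch, assuming only ordinary Struct behavior (unset members default to nil and are omitted from #to_h):

require "aws-sdk-bedrockagent"

config = Aws::BedrockAgent::Types::PromptModelInferenceConfiguration.new(
  max_tokens: 256,
  temperature: 0.2,
  top_p: 0.9
)

config.max_tokens      # => 256
config.stop_sequences  # => nil (unset members default to nil)
config.to_h            # => {max_tokens: 256, temperature: 0.2, top_p: 0.9}

The empty SENSITIVE constant above indicates that none of these members are flagged for redaction when the struct is logged or inspected.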

#stop_sequences ⇒ Array<String>

A list of strings that define sequences after which the model will stop generating.

Returns:

  • (Array<String>)



#temperature ⇒ Float

Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.

Returns:

  • (Float)



#top_k ⇒ Integer

The number of most-likely candidates that the model considers for the next token during generation.

Returns:

  • (Integer)



#top_p ⇒ Float

The percentage of most-likely candidates that the model considers for the next token.

Returns:

  • (Float)


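
To read these settings back from an existing prompt, here is a hedged sketch using Aws::BedrockAgent::Client#get_prompt; the region and prompt identifier are placeholders, and variants without a text inference configuration are skipped.

require "aws-sdk-bedrockagent"

client = Aws::BedrockAgent::Client.new(region: "us-east-1")

# "PROMPT_ID" is a placeholder for a real prompt identifier or ARN.
prompt = client.get_prompt(prompt_identifier: "PROMPT_ID")

prompt.variants.each do |variant|
  model_config = variant.inference_configuration&.text
  next unless model_config

  puts "#{variant.name}: max_tokens=#{model_config.max_tokens}, " \
       "temperature=#{model_config.temperature}, top_p=#{model_config.top_p}"
end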