Class: Aws::Glue::Types::FindMatchesMetrics
- Inherits: Struct
  - Object
  - Struct
  - Aws::Glue::Types::FindMatchesMetrics
- Includes:
  - Structure
- Defined in:
- lib/aws-sdk-glue/types.rb
Overview
The evaluation metrics for the find matches algorithm. The quality of your machine learning transform is measured by getting your transform to predict some matches and comparing the results to known matches from the same dataset. The quality metrics are based on a subset of your data, so they are not precise.
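In practice these metrics typically arrive on the response of `Aws::Glue::Client#get_ml_transform`, under `evaluation_metrics.find_matches_metrics`. A minimal sketch of the field-access pattern, using a plain Ruby `Struct` as a stand-in for the SDK type so no AWS call is needed:

```ruby
# Stand-in for Aws::Glue::Types::FindMatchesMetrics; the real struct is
# returned by the SDK (e.g. via get_ml_transform), not built by hand.
FindMatchesMetricsStub = Struct.new(
  :area_under_pr_curve, :precision, :recall, :f1,
  :confusion_matrix, :column_importances, keyword_init: true)

metrics = FindMatchesMetricsStub.new(
  area_under_pr_curve: 0.92,
  precision: 0.88,
  recall: 0.81,
  f1: 0.84,
  confusion_matrix: nil,    # a Types::ConfusionMatrix in a real response
  column_importances: [])   # an Array<Types::ColumnImportance>

puts format("AUPRC=%.2f precision=%.2f recall=%.2f f1=%.2f",
            metrics.area_under_pr_curve, metrics.precision,
            metrics.recall, metrics.f1)
```

The values above are illustrative, not from a real transform.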
Constant Summary
- SENSITIVE = []
Instance Attribute Summary
- #area_under_pr_curve ⇒ Float
  The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall.
- #column_importances ⇒ Array<Types::ColumnImportance>
  A list of `ColumnImportance` structures containing column importance metrics, sorted in order of descending importance.
- #confusion_matrix ⇒ Types::ConfusionMatrix
  The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.
- #f1 ⇒ Float
  The maximum F1 metric indicates the transform's accuracy between 0 and 1, where 1 is the best accuracy.
- #precision ⇒ Float
  The precision metric indicates how often your transform is correct when it predicts a match.
- #recall ⇒ Float
  The recall metric indicates, for an actual match, how often your transform predicts the match.
Instance Attribute Details
#area_under_pr_curve ⇒ Float
The area under the precision/recall curve (AUPRC) is a single number measuring the overall quality of the transform, independent of the choice made for precision vs. recall. Higher values indicate that you have a more attractive precision vs. recall tradeoff.
For more information, see [Precision and recall] in Wikipedia.
# File 'lib/aws-sdk-glue/types.rb', line 9294

class FindMatchesMetrics < Struct.new(
  :area_under_pr_curve,
  :precision,
  :recall,
  :f1,
  :confusion_matrix,
  :column_importances)
  SENSITIVE = []
  include Aws::Structure
end
#column_importances ⇒ Array<Types::ColumnImportance>
A list of `ColumnImportance` structures containing column importance metrics, sorted in order of descending importance.
#confusion_matrix ⇒ Types::ConfusionMatrix
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.
For more information, see [Confusion matrix] in Wikipedia.
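The counts in the confusion matrix determine precision and recall directly. A hedged sketch in plain Ruby, assuming the counts correspond to the true-positive, false-positive, and false-negative fields of `Types::ConfusionMatrix`:

```ruby
# Derive precision and recall from confusion-matrix counts.
def precision_recall(tp:, fp:, fn:)
  precision = tp.to_f / (tp + fp)   # correct matches among predicted matches
  recall    = tp.to_f / (tp + fn)   # correct matches among actual matches
  [precision, recall]
end

p precision_recall(tp: 80, fp: 20, fn: 20)  # => [0.8, 0.8]
```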
#f1 ⇒ Float
The maximum F1 metric indicates the transform’s accuracy between 0 and 1, where 1 is the best accuracy.
For more information, see [F1 score] in Wikipedia.
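F1 is the harmonic mean of precision and recall, and the metric reported here is the maximum F1 over candidate decision thresholds. A sketch of the standard formula in plain Ruby (the textbook definition, not SDK code):

```ruby
# F1 score: harmonic mean of precision and recall.
# Penalizes an imbalance between the two more than a simple average would.
def f1(precision, recall)
  return 0.0 if precision + recall == 0
  2.0 * precision * recall / (precision + recall)
end

puts f1(0.88, 0.81).round(3)  # never exceeds the smaller of the two inputs
```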
#precision ⇒ Float
The precision metric indicates how often your transform is correct when it predicts a match. Specifically, it measures the fraction of predicted matches that are true matches.
For more information, see [Precision and recall] in Wikipedia.
#recall ⇒ Float
The recall metric indicates, for an actual match, how often your transform predicts the match. Specifically, it measures how well the transform finds true positives from the total actual matches in the source data.
For more information, see [Precision and recall] in Wikipedia.