Class | Description |
---|---|
ConfusionMatrix<T extends Comparable<? super T>> | |
Evaluation | Evaluation metrics: precision, recall, f1, fBeta, accuracy, Matthews correlation coefficient, gMeasure. Top-N accuracy is available via the Evaluation(List, int) constructor. A custom binary decision threshold can be set via Evaluation(double); the default, if not set, is argmax / 0.5. A custom cost array can be supplied via Evaluation(INDArray), or Evaluation(List, INDArray) for multi-class. Note: care should be taken when using the Evaluation class for binary classification metrics such as f1, precision, and recall. See the first sketch after this table. |
EvaluationBinary | Used for evaluating networks with binary classification outputs (see the second sketch after this table). |
EvaluationCalibration | An evaluation class designed to analyze the calibration of a classifier. |
ROC | ROC (Receiver Operating Characteristic) for binary classifiers. |
ROC.CountsForThreshold | |
ROCBinary | ROC (Receiver Operating Characteristic) for multi-task binary classifiers. |
ROCMultiClass | ROC (Receiver Operating Characteristic) for multi-class classifiers. |
Enum | Description |
---|---|
Evaluation.Metric | |
EvaluationBinary.Metric | |
ROC.Metric | AUROC: area under the ROC curve. AUPRC: area under the precision-recall curve. See the sketch after this table. |
ROCBinary.Metric | AUROC: area under the ROC curve. AUPRC: area under the precision-recall curve. |
ROCMultiClass.Metric | AUROC: area under the ROC curve. AUPRC: area under the precision-recall curve. |