In sklearn, the classification_report module lets you see all the evaluation metrics at once.
#1. classification_report
from sklearn.metrics import classification_report

# model is a fitted classifier; X_test_features / y_test are the test set
y_pred = model.predict(X_test_features)
print(classification_report(y_test, y_pred, target_names=['normal', 'abnormal']))
Reference: scikit-learn.org/stable/modules/generated/sklearn.metrics.classification_report.html
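A minimal runnable version of the snippet above, with made-up toy labels standing in for the note's model and test set (0 = normal, 1 = abnormal are assumptions for illustration):

```python
from sklearn.metrics import classification_report

y_test = [0, 0, 0, 1, 1, 1, 1, 1]  # hypothetical ground-truth labels
y_pred = [0, 0, 1, 1, 1, 1, 0, 1]  # hypothetical model predictions

# Builds one text table with per-class precision, recall, F1, support,
# plus the accuracy, macro avg, and weighted avg rows.
report = classification_report(y_test, y_pred,
                               target_names=['normal', 'abnormal'])
print(report)
```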
#2. Interpreting the metrics
Precision: accuracy of positive predictions.
Precision = TP / (TP + FP)
Recall: fraction of actual positives that were correctly identified.
Recall = TP / (TP + FN)
F1 score: harmonic mean of precision and recall.
F1 = 2 * (Precision * Recall) / (Precision + Recall)
Accuracy: fraction of all predictions that were correct.
Accuracy = (TP + TN) / (TP + TN + FP + FN)
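The four formulas above can be checked by hand against sklearn's metric functions. The labels below are made up for illustration; the positive class is 1 ('abnormal'):

```python
from sklearn.metrics import (accuracy_score, confusion_matrix,
                             f1_score, precision_score, recall_score)

y_test = [0, 0, 0, 1, 1, 1, 1, 1]  # hypothetical labels
y_pred = [0, 0, 1, 1, 1, 1, 0, 1]  # hypothetical predictions

# confusion_matrix for binary labels flattens to (TN, FP, FN, TP)
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()

precision = tp / (tp + fp)                          # TP / (TP + FP)
recall = tp / (tp + fn)                             # TP / (TP + FN)
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
accuracy = (tp + tn) / (tp + tn + fp + fn)          # correct / all

# hand-computed values match sklearn's implementations
assert abs(precision - precision_score(y_test, y_pred)) < 1e-12
assert abs(recall - recall_score(y_test, y_pred)) < 1e-12
assert abs(f1 - f1_score(y_test, y_pred)) < 1e-12
assert abs(accuracy - accuracy_score(y_test, y_pred)) < 1e-12
```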
macro avg = unweighted mean of the per-class scores: (metric_normal + metric_abnormal) / 2, computed separately for precision, recall, and F1.
weighted avg = support-weighted mean: (support_normal * metric_normal + support_abnormal * metric_abnormal) / (support_normal + support_abnormal), so the larger class counts for more.
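The macro vs. weighted distinction can be verified with the same made-up imbalanced labels (3 normal vs. 5 abnormal): macro ignores class sizes, weighted multiplies each class's score by its support.

```python
from sklearn.metrics import f1_score

y_test = [0, 0, 0, 1, 1, 1, 1, 1]  # hypothetical labels: 3 normal, 5 abnormal
y_pred = [0, 0, 1, 1, 1, 1, 0, 1]  # hypothetical predictions

# average=None returns one F1 per class (class 0 = normal, 1 = abnormal)
f1_normal, f1_abnormal = f1_score(y_test, y_pred, average=None)

macro = (f1_normal + f1_abnormal) / 2                 # plain mean
weighted = (3 * f1_normal + 5 * f1_abnormal) / 8      # supports 3 and 5

assert abs(macro - f1_score(y_test, y_pred, average='macro')) < 1e-12
assert abs(weighted - f1_score(y_test, y_pred, average='weighted')) < 1e-12
```

On an imbalanced test set like this, the weighted average sits closer to the majority class's score, which is why the two rows of the report can differ noticeably.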
Reference: "Understanding a Classification Report For Your Machine Learning Model" (medium.com) — the classification report visualizer displays the precision, recall, F1, and support scores for the model.
Reference: "macro average and weighted average meaning in classification_report" (datascience.stackexchange.com) — a Q&A on reading the macro and weighted average rows when evaluating an imbalanced binary classification.