Confusion Matrix
Interpretation Aid
Reading the confusion matrix:
- Rows = actual outcomes, columns = predicted outcomes
- Diagonal cells (TN, TP) = correct predictions
- Off-diagonal cells (FP, FN) = errors
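The layout above can be sketched in a few lines of plain Python; the `actual` and `predicted` arrays are hypothetical illustration data, not from any real model.

```python
# Hypothetical example labels (0 = negative, 1 = positive)
actual    = [0, 0, 1, 1, 1, 0, 1, 0]
predicted = [0, 1, 1, 0, 1, 0, 1, 0]

# Tally the four cell counts
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)

# Rows = actual, columns = predicted; diagonal = TN, TP
matrix = [[tn, fp],
          [fn, tp]]
```

In practice a library call such as scikit-learn's `confusion_matrix` produces the same rows-actual / columns-predicted layout.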
Classification metrics:
- Accuracy = overall % correct: (TP + TN) / total
- Sensitivity (recall) = % of actual 1s correctly identified: TP / (TP + FN)
- Specificity = % of actual 0s correctly identified: TN / (TN + FP)
- Precision = % of predicted 1s that are actually 1: TP / (TP + FP)
- F1 Score = harmonic mean of precision and recall
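The metric definitions above translate directly into arithmetic on the four cell counts. The counts here are made-up values chosen for illustration.

```python
# Hypothetical confusion-matrix counts
tp, tn, fp, fn = 40, 45, 5, 10
total = tp + tn + fp + fn

accuracy    = (tp + tn) / total   # overall fraction correct
sensitivity = tp / (tp + fn)      # recall: actual 1s caught
specificity = tn / (tn + fp)      # actual 0s caught
precision   = tp / (tp + fp)      # predicted 1s that are right

# Harmonic mean of precision and recall
f1 = 2 * precision * sensitivity / (precision + sensitivity)
```

Note that accuracy can look strong even when sensitivity is weak, which is why the other metrics matter on imbalanced data.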
Threshold adjustment:
Adjust the classification threshold to balance sensitivity vs. specificity based on your business priorities. A lower threshold catches more positives (higher sensitivity, fewer false negatives) but increases false alarms (lower specificity, more false positives).