Enter the values from your confusion matrix to evaluate model performance.
Provide the True Positive (TP), False Positive (FP), True Negative (TN), and False Negative (FN) values from your model's confusion matrix. The calculator will compute the MCC and other key performance metrics.
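The computation behind such a calculator can be sketched in a few lines. This is a minimal illustration, not the page's actual implementation; the function name and the exact set of "other key metrics" (accuracy, precision, recall, F1) are assumptions:

```python
import math

def confusion_metrics(tp, fp, tn, fn):
    """Compute MCC and other common metrics from confusion-matrix counts.

    Illustrative sketch: assumes the 'other key metrics' are accuracy,
    precision, recall, and F1.
    """
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    # MCC denominator: geometric mean of the four row/column sums
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0  # 0 by convention
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "mcc": mcc}
```

For example, `confusion_metrics(90, 10, 85, 15)` yields an accuracy of 0.875 and an MCC of roughly 0.751.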
Explore different scenarios to understand how MCC works.
A scenario where the model performs well on a balanced dataset.
TP: 90, FP: 10
TN: 85, FN: 15
An example with an imbalanced dataset, where MCC is particularly useful.
TP: 95, FP: 5
TN: 9900, FN: 0
A model that performs poorly, close to a random guess.
TP: 50, FP: 50
TN: 50, FN: 50
A perfect model with no errors, resulting in the highest possible MCC score.
TP: 100, FP: 0
TN: 100, FN: 0
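Running the four scenarios above through the MCC formula confirms the intuition behind each one. This is a small sketch; the helper function name is illustrative:

```python
import math

def mcc(tp, fp, tn, fn):
    # Matthews correlation coefficient; defined as 0 when the
    # denominator vanishes (an entire row or column sums to zero).
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

scenarios = {
    "balanced, good": (90, 10, 85, 15),
    "imbalanced":     (95, 5, 9900, 0),
    "random guess":   (50, 50, 50, 50),
    "perfect":        (100, 0, 100, 0),
}
for name, counts in scenarios.items():
    print(f"{name}: MCC = {mcc(*counts):.3f}")
```

This prints roughly 0.751 for the balanced scenario, 0.974 for the imbalanced one, exactly 0.000 for the random guess, and 1.000 for the perfect model.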
MCC = (TP * TN - FP * FN) / sqrt((TP + FP) * (TP + FN) * (TN + FP) * (TN + FN))

The numerator, TP * TN - FP * FN, essentially measures the covariance between the predicted and actual values. A large positive value means the predictions align well with reality. The denominator is a normalization factor that scales the result between -1 and +1; it is the geometric mean of the four row and column sums of the confusion matrix.

A special case arises when an entire row or column of the confusion matrix sums to zero (for example, TN + FP = 0). In this situation, the denominator becomes zero, which would lead to a division-by-zero error. By convention, the MCC is defined as 0 in such cases, reflecting that the model has no predictive power.
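This convention is straightforward to implement with a guard before dividing. A minimal sketch, with an illustrative function name:

```python
import math

def safe_mcc(tp, fp, tn, fn):
    """MCC with the conventional fallback to 0 when the denominator is zero."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        # An entire row or column of the confusion matrix sums to zero,
        # e.g. the model never predicted (or never saw) one of the classes.
        return 0.0
    return (tp * tn - fp * fn) / denom

# A degenerate model that labels everything positive: TN + FN = 0
print(safe_mcc(100, 50, 0, 0))  # -> 0.0
```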