
Binary classification ratios

The package computes the quality metrics (ratios) that arise in binary classification. A binary classification result is summarized by its confusion matrix, which consists of the numbers of true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN) produced by an algorithm or procedure.

The ratios are easier to interpret than the confusion matrix.

For example, Accuracy is the ratio of correct responses to the total number of responses

$$ \mathrm{Accuracy} = \frac{\mathrm{TP} + \mathrm{TN}}{\mathrm{TP} + \mathrm{TN} + \mathrm{FP} + \mathrm{FN}}. $$

Another popular ratio is Recall

$$ \mathrm{Recall} = \frac{\mathrm{TP}}{\mathrm{TP} + \mathrm{FN}}. $$
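These ratios are simple arithmetic on the four confusion-matrix counts. A minimal sketch in plain Python (a hypothetical helper for illustration, not part of this package's API), using the same counts as the command-line example in the Usage section:

```python
def classification_ratios(tp: int, tn: int, fp: int, fn: int) -> dict:
    """Compute common binary-classification ratios from confusion-matrix counts."""
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total  # correct responses / all responses
    precision = tp / (tp + fp)    # true positives / predicted positives
    recall = tp / (tp + fn)       # true positives / actual positives
    # F1-score is the harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

ratios = classification_ratios(tp=10, tn=20, fp=30, fn=40)
print(ratios)  # accuracy 0.3, precision 0.25, recall 0.2, f1 ≈ 0.222
```

With TP = 10, TN = 20, FP = 30, FN = 40 this gives Accuracy = 30/100 = 0.3 and Recall = 10/50 = 0.2, matching the formulas above.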

Usage

The package provides a command-line utility, binary-classification-ratios. It takes the optional arguments -tp, -tn, -fp, and -fn, computes popular binary-classification ratios such as Accuracy, Recall, Precision, and F1-score, and prints them to the terminal.

```
binary-classification-ratios -tp 10 -tn 20 -fp 30 -fn 40
Confusion matrix TP 10 TN 20 FP 30 FN 40
     accuracy 0.30000
    precision 0.250
       recall 0.200
     f1-score 0.222
```

The package is also designed for use in other Python projects where the elements of the confusion matrix are known:

```python
from binary_classification_ratios import BinaryClassificationRatios

ratios = BinaryClassificationRatios(tp=10, tn=20, fp=30, fn=40)
print(ratios.get_summary())

ratios.assert_min(0.9, 0.8, 0.7)
```

Install

```
pip install binary-classification-ratios
```
