The package computes the quality metrics (ratios) arising in binary classification. A binary classification result is summarized by a confusion matrix, which is given by the numbers of true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN) produced by an algorithm or procedure.
The ratios are easier to interpret than the confusion matrix itself. For example, Accuracy is the ratio of correct responses to the total number of responses, (TP + TN) / (TP + TN + FP + FN). Another popular ratio is Recall, the fraction of actual positives that are correctly identified, TP / (TP + FN).
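The two formulas above can be sketched in plain Python (this is an illustration of the standard definitions, not the package API):

```python
def accuracy(tp, tn, fp, fn):
    # fraction of all responses that are correct
    return (tp + tn) / (tp + tn + fp + fn)

def recall(tp, fn):
    # fraction of actual positives that are identified
    return tp / (tp + fn)

print(accuracy(tp=10, tn=20, fp=30, fn=40))  # 0.3
print(recall(tp=10, fn=40))                  # 0.2
```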
The package provides a command-line utility, binary-classification-ratios. The utility takes the optional arguments -tp, -tn, -fp, and -fn, computes popular binary-classification ratios such as Accuracy, Precision, Recall, and F1-score, and prints them to the terminal:
```
binary-classification-ratios -tp 10 -tn 20 -fp 30 -fn 40
Confusion matrix TP 10 TN 20 FP 30 FN 40
accuracy 0.30000
precision 0.250
recall 0.200
f1-score 0.222
```

The package is designed to be useful in other Python projects where the elements of the confusion matrix are known:
```python
from binary_classification_ratios import BinaryClassificationRatios

ratios = BinaryClassificationRatios(tp=10, tn=20, fp=30, fn=40)
print(ratios.get_summary())
ratios.assert_min(0.9, 0.8, 0.7)
```

The package can be installed with pip:

```
pip install binary-classification-ratios
```
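The values printed by the command-line utility can be checked by hand with the standard Precision and F1-score formulas (plain Python for illustration, not the package API):

```python
tp, tn, fp, fn = 10, 20, 30, 40

precision = tp / (tp + fp)  # 10/40 = 0.25
recall = tp / (tp + fn)     # 10/50 = 0.2
# F1-score is the harmonic mean of Precision and Recall
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 3), round(recall, 3), round(f1, 3))  # 0.25 0.2 0.222
```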