Confusion-0.5-win.zip is the Windows distribution of Confusion v0.5, a specialized machine learning tool used to generate and analyze confusion matrices.

Core Components & Metrics

The tool processes classification data to build a matrix where rows represent actual classes and columns represent predicted classes. The key terms and metrics are:

- True Positives (TP): correctly predicted positive instances. Measures successful identification.
- False Positives (FP): incorrectly predicted positive instances. Indicates a Type I error (false alarm).
- False Negatives (FN): missed positive instances. Indicates a Type II error (missed detection).
- Precision, TP / (TP + FP): shows how reliable the positive predictions are.
- Recall (Sensitivity), TP / (TP + FN): shows how many actual positives the model found.

Advanced Analysis Features

- Detailed performance reports including Sensitivity, Specificity, Precision, and F-score.
- For multi-class datasets, the report provides both macro-averaged (equal weight to each class) and micro-averaged (equal weight to each instance) scores.

Usage Instructions

1. Unzip Confusion-0.5-win.zip to a local folder.
2. Create a text or CSV file where each line contains the Actual Class followed by the Predicted Class (e.g., "Class_A Class_B").
3. Open a terminal (CMD or PowerShell) in the extraction directory and execute: confusion.exe

The program will print a formatted matrix to the console along with a summary report for each class.
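To make the matrix layout and per-class metrics concrete, here is a minimal Python sketch (an independent illustration, not the tool's own code) that builds a confusion matrix from "actual, predicted" pairs and derives TP, FP, FN, precision, and recall per class. The class names and counts are hypothetical.

```python
from collections import defaultdict

# Hypothetical (actual, predicted) pairs, mirroring the input file
# format described above; Class_A/Class_B are illustrative names.
pairs = [
    ("Class_A", "Class_A"),
    ("Class_A", "Class_B"),
    ("Class_B", "Class_B"),
    ("Class_B", "Class_B"),
    ("Class_B", "Class_A"),
]

# Rows = actual class, columns = predicted class.
matrix = defaultdict(lambda: defaultdict(int))
for actual, predicted in pairs:
    matrix[actual][predicted] += 1

classes = sorted({c for a, p in pairs for c in (a, p)})
for cls in classes:
    tp = matrix[cls][cls]                                  # correctly predicted positives
    fp = sum(matrix[a][cls] for a in classes if a != cls)  # Type I errors (false alarms)
    fn = sum(matrix[cls][p] for p in classes if p != cls)  # Type II errors (missed detections)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    print(f"{cls}: TP={tp} FP={fp} FN={fn} "
          f"precision={precision:.2f} recall={recall:.2f}")
```

Reading a column of the matrix gives the FP count for that predicted class; reading a row gives the FN count for that actual class, which is why the rows-are-actual, columns-are-predicted convention matters.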
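The difference between the macro- and micro-averaged scores in the multi-class report can also be shown in a short sketch, using hypothetical per-class TP/FP counts (these numbers are illustrative, not output of the tool):

```python
# Hypothetical per-class counts for a two-class report.
per_class = {
    "Class_A": {"tp": 8, "fp": 2},
    "Class_B": {"tp": 1, "fp": 3},
}

# Macro average: compute precision per class, then average the results,
# so every class carries equal weight regardless of its size.
macro = sum(c["tp"] / (c["tp"] + c["fp"]) for c in per_class.values()) / len(per_class)

# Micro average: pool the counts first, then compute one precision,
# so every instance carries equal weight.
tp_total = sum(c["tp"] for c in per_class.values())
fp_total = sum(c["fp"] for c in per_class.values())
micro = tp_total / (tp_total + fp_total)

print(f"macro precision = {macro:.3f}")  # 0.525: the weak class drags it down
print(f"micro precision = {micro:.3f}")  # 0.643: dominated by the larger class
```

On imbalanced data the two can diverge sharply, which is why the report includes both.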