I am looking to create a confusion matrix out of a tabled query of the form

[query] | table unchanged true pred

where, due to circumstances upstream from me, "unchanged" holds the result when the prediction was correct, "true" holds the ground truth when the prediction was wrong, and "pred" holds the prediction when the prediction was wrong. Example tabular output, assuming the categories are A, B, C:

unchanged | true | pred
A         |      |
B         |      |
          | A    | B
          | C    | A
etc.

I would like to accumulate the counts into a confusion matrix. So let's say the classifier categories are A, B, C. The summary table should count the matches from "unchanged" along the diagonal and put each ("true", "pred") pair into the appropriate off-diagonal cell. Example table:

        pred:  A    B    C
true: A       12    6    1
      B        6   20    2
      C        2    3   30

If you are familiar with confusion matrices you will have the idea. How can I generate such a summary table?
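In case it helps, here is roughly what I have been experimenting with. It folds the two cases back into a single actual/predicted pair and then pivots on those; the handling of the empty cells (isnull vs. empty string) is a guess on my part, and [query] stands in for the real search as above:

[query]
| table unchanged true pred
| eval actual=if(isnull('true') OR 'true'=="", unchanged, 'true')        ``` ground truth: unchanged when correct, true when wrong ```
| eval predicted=if(isnull(pred) OR pred=="", unchanged, pred)           ``` prediction: unchanged when correct, pred when wrong ```
| chart count over actual by predicted                                   ``` rows = actual class, columns = predicted class ```
| fillnull value=0

Is chart ... over ... by the right way to pivot this, or is there a cleaner approach (stats plus xyseries, or the contingency command)?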