Classification Learner - Confusion matrix

EK on 30 Mar 2020
Commented: EK on 3 Apr 2020
I am trying to interpret the results of the confusion matrix (Classification Learner Toolbox) but cannot find the True Negative rate (TN) or the False Positive (FP) values. I am wondering if FP is related to the 'False Discovery Rates' and whether the 'Positive Predictive Values' could be the TN values?

Answers (1)

Athul Prakash on 3 Apr 2020
Hi EK,
You are correct about False Discovery Rates being the same as False Positives. However, I'm not sure which metric you mean by True Negatives: for a multiclass problem, do you mean that every example that is not labelled as class 'A' and is not predicted as class 'A' counts as a TN for class A (and similarly for every class in your dataset)?
You can calculate this metric by subtracting all the TPs, FPs and FNs from the total number of examples, which leaves you with the TNs, since TP + FP + TN + FN = total number of examples.
Alternatively, you can export the model trained in the app to your workspace and then get predictions on your test data. After that, you can use 'confusionmat' to obtain the confusion matrix as a MATLAB array, from which you can calculate each of these four metrics manually.
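As an illustration only, here is a minimal sketch of that workflow. It assumes the app exported a struct named trainedModel (the default export name, which provides a predictFcn field) and that your test data are in a table testData whose response variable is called Label; adjust these names to match your own workspace.

```matlab
% Assumes 'trainedModel' was exported from Classification Learner and
% 'testData' is a table whose response variable is named 'Label'.
predictedLabels = trainedModel.predictFcn(testData);   % predictions on held-out data
trueLabels      = testData.Label;

% Raw confusion matrix: rows = true class, columns = predicted class.
[C, order] = confusionmat(trueLabels, predictedLabels);

total = sum(C(:));
for k = 1:numel(order)
    TP = C(k, k);                  % correctly predicted as class k
    FP = sum(C(:, k)) - TP;        % predicted as class k but actually another class
    FN = sum(C(k, :)) - TP;        % actually class k but predicted as another class
    TN = total - TP - FP - FN;     % everything else, since TP+FP+TN+FN = total
    fprintf('%s: TP=%d FP=%d FN=%d TN=%d\n', string(order(k)), TP, FP, FN, TN);
end
```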
Hope it helps!
  1 Comment
EK on 3 Apr 2020
Hi AP,
Thank you for your reply. In the Classification Learner Toolbox one can get two types of confusion matrices; I am attaching them below as A and B. I have two classes, S+ and S-. Matrix A gives me TP values of 69% for S+ and 72% for S-, and FN values of 31% for S+ and 28% for S-. I assume that matrix B gives FP values of 30% for S+ and 29% for S-, and TN values of 71% for S+ and 70% for S-. However, in matrix B these are labelled as positive predictive values, so I am not sure whether I am correct in interpreting them as FP and TN.
Thank you
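For reference, a small sketch of how the two displays in the app relate to the raw counts; it assumes C is the count matrix returned by confusionmat, with rows as true classes and columns as predicted classes. Matrix A corresponds to the row-normalised view (TPR/FNR) and matrix B to the column-normalised view (PPV/FDR), which is why matrix B is not the same thing as FP and TN rates.

```matlab
% C: confusion matrix of counts, rows = true class, columns = predicted class.

% Matrix A in the app (row-normalised): true positive rate / false negative rate.
rowTotals = sum(C, 2);
TPR = diag(C) ./ rowTotals;        % share of each true class that is predicted correctly
FNR = 1 - TPR;                     % share of each true class predicted as some other class

% Matrix B in the app (column-normalised): positive predictive value / false discovery rate.
colTotals = sum(C, 1).';
PPV = diag(C) ./ colTotals;        % share of each predicted class that is actually correct
FDR = 1 - PPV;                     % share of each predicted class that is actually wrong
```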


Release

R2019b
