In the ROC-generating component, the AUC (ROC score) is frequently calculated, but in many cases the output shows something like this as the title of the plot:
"Accuracy 0.98: Excellent". As I understand it, AUC and accuracy are two different things, so this is confusing me.
For example, I built a Bayesian model. The leave-one-out cross-validation (LOO-CV) results on the training set showed a cross-validated ROC AUC of 0.94, with 'TP=226, TN=236, N=506' (which gives an accuracy of (226+236)/506 ≈ 0.91). Yet the title of this training-set ROC plot says "Accuracy 0.94: Excellent", i.e. it is labelled "Accuracy" but reports the AUC value.
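To make the distinction concrete, here is a small self-contained sketch (the numbers are made up for illustration, not the data from my model): accuracy is computed from hard labels at one fixed threshold, while ROC AUC is the probability that a randomly chosen positive is scored above a randomly chosen negative, so the two can differ for the same classifier.

```python
# Toy scores (hypothetical, for illustration only)
y_true = [0, 0, 0, 0, 1, 1, 1, 1]
y_prob = [0.1, 0.3, 0.6, 0.2, 0.8, 0.9, 0.4, 0.7]  # classifier scores

# Accuracy: hard labels at the single threshold 0.5
y_pred = [1 if p >= 0.5 else 0 for p in y_prob]
accuracy = sum(int(p == t) for p, t in zip(y_pred, y_true)) / len(y_true)

# ROC AUC via the Mann-Whitney pairwise count (ties count half):
# fraction of (positive, negative) pairs ranked correctly
pos = [p for p, t in zip(y_prob, y_true) if t == 1]
neg = [p for p, t in zip(y_prob, y_true) if t == 0]
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))

print(accuracy, auc)  # prints 0.75 0.9375 -- two different numbers
```

So a plot title that pairs the word "Accuracy" with a value equal to the AUC is mixing the two metrics.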
What am I missing here? Could somebody please clarify this?
Thank you