ROC curve gives a visual cue for the optimal TPR/FPR trade-off in a classification model

  • The ROC curve makes it easy to visualize Sensitivity and Specificity and to find the optimal balance between the two in a classification model

    • ROC is a plot of TPR against FPR: TPR (True Positive Rate, i.e. Sensitivity) on the y-axis and FPR (False Positive Rate, calculated as 1 − Specificity) on the x-axis. Both are derived from the counts in the Confusion Matrix.
      • TPR is ‘the fraction of patients with heart disease who are correctly identified’.
      • FPR is ‘the fraction of patients without heart disease who are incorrectly identified as having heart disease’.
    • Increasing the threshold decreases both TPR and FPR, so specificity increases; inversely, decreasing the threshold increases both TPR and FPR, so specificity decreases.
    • The plot gives a quicker visual cue for finding the desired threshold, as an alternative to generating a Confusion Matrix for each candidate threshold.
      • Most of the time, the point closest to the top-left corner of the ROC curve gives a quite good threshold (see the first sketch below this list).
    • Depending on how many false positives you are willing to accept, you decide the optimal threshold. For example, if you don’t want too many false positives, you should choose a high threshold value; this will, however, also give you more false negatives.
  • In a logistic regression model the cut-off is applied to the predicted probability, but in a simple single-feature classification (e.g. predicting survival from age in the Titanic dataset) the cut-off is the age itself

    • The ROC curve is plotted by calculating TP, FP, TN and FN at strategic intervals of the cut-off (threshold); both sketches below this list illustrate this.
        • the formula shown on the screen is incorrect, but the concept the author meant is right
    • For my study on UTC vs. $x$ of Price to predict positive NPV, the $x$ would be the cut-off (threshold), while the different models could be Price = Average Price or Price = Discounted Price
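
Below is a minimal sketch of how the ROC points can be computed, assuming hypothetical arrays `y_true` (0/1 labels) and `y_score` (predicted probabilities). It counts TP, FP, TN and FN at each threshold, derives TPR = TP/(TP+FN) and FPR = FP/(FP+TN) = 1 − Specificity, and applies the top-left rule of thumb by picking the threshold whose point lies closest to (FPR = 0, TPR = 1).

```python
import numpy as np

# Hypothetical example data: true 0/1 labels and predicted probabilities
y_true = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 1])
y_score = np.array([0.1, 0.3, 0.35, 0.4, 0.5, 0.55, 0.7, 0.8, 0.85, 0.9])

def roc_points(y_true, y_score, thresholds):
    """Count TP, FP, TN, FN at each threshold and return (threshold, FPR, TPR)."""
    points = []
    for t in thresholds:
        y_pred = (y_score >= t).astype(int)
        tp = np.sum((y_pred == 1) & (y_true == 1))
        fp = np.sum((y_pred == 1) & (y_true == 0))
        tn = np.sum((y_pred == 0) & (y_true == 0))
        fn = np.sum((y_pred == 0) & (y_true == 1))
        tpr = tp / (tp + fn)   # sensitivity
        fpr = fp / (fp + tn)   # 1 - specificity
        points.append((t, fpr, tpr))
    return points

# Strategic intervals of the threshold
thresholds = np.linspace(0.0, 1.0, 21)
points = roc_points(y_true, y_score, thresholds)

# "Top-left" rule of thumb: the threshold whose (FPR, TPR) point is closest to (0, 1)
best_t, best_fpr, best_tpr = min(points, key=lambda p: p[1] ** 2 + (p[2] - 1.0) ** 2)
print(f"best threshold={best_t:.2f}  FPR={best_fpr:.2f}  TPR={best_tpr:.2f}")
```

Raising the threshold in this sweep reproduces the trade-off noted above: both TPR and FPR fall while specificity rises.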
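
For the single-feature case mentioned above (cut-off applied directly to the feature rather than to a predicted probability), here is a sketch with hypothetical ages and survival labels: the cut-off is swept over the age values and TP, FP, TN, FN are recounted at each interval to yield one ROC point per cut-off.

```python
import numpy as np

# Hypothetical single-feature example: predict "survived" (1) directly from age,
# classifying a passenger as positive when age <= cut-off (no probability model)
age      = np.array([4, 8, 15, 22, 27, 34, 41, 50, 58, 63])
survived = np.array([1, 1, 1, 1, 0, 0, 1, 0, 0, 0])

for cutoff in range(0, 71, 10):          # strategic intervals of the cut-off
    pred = (age <= cutoff).astype(int)
    tp = np.sum((pred == 1) & (survived == 1))
    fp = np.sum((pred == 1) & (survived == 0))
    tn = np.sum((pred == 0) & (survived == 0))
    fn = np.sum((pred == 0) & (survived == 1))
    tpr = tp / (tp + fn)                 # sensitivity
    fpr = fp / (fp + tn)                 # 1 - specificity
    print(f"cut-off age <= {cutoff:2d}: TPR={tpr:.2f}  FPR={fpr:.2f}")
```

The same pattern would apply to the UTC vs. Price study: sweep the cut-off $x$ over Price, recount TP/FP/TN/FN at each value, and compare the models (Average Price vs. Discounted Price) by their resulting ROC curves.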

References

  • [Understand ROC curve and AUC with Excel | INDXAR](https://indxar.com/contents/1274)
  • [What Is ROC Curve? by Saurav Jadhav | Analytics Vidhya | Medium](https://medium.com/analytics-vidhya/what-is-roc-curve-1f776103c998)
  • [Classification: ROC Curve and AUC | Machine Learning | Google Developers](https://developers.google.com/machine-learning/crash-course/classification/roc-and-auc)
  • How to Interpret a ROC Curve (With Examples) - Statology

Metadata

  • topic:: 00 Statistics
  • updated:: 2022-08-05
  • reviewed:: 2022-08-05
  • #PermanentNote