
Article Published: Separating the Odds: Thresholds for Entropy in Logistic Regression

Drs. Brandi A. Weiss and William Dardick published an article titled, "Separating the odds: Thresholds for entropy in logistic regression" in the Journal of Experimental Education (DOI: 10.1080/00220973.2019.1587735).

Researchers are often reluctant to rely on classification rates because a model with favorable classification rates but poor separation may not replicate well. In comparison, entropy captures information about borderline cases that are unlikely to generalize to the population. In logistic regression, the correctness of predicted group membership is known; however, this information has not yet been utilized in entropy calculations. The purpose of this study was to (1) introduce three new variants of entropy as approximate model-fit measures, (2) establish rule-of-thumb thresholds to determine whether a theoretical model fits the data, and (3) investigate the empirical Type I error and statistical power associated with those thresholds. Results are presented from two Monte Carlo simulations. Simulation results indicated that EFR-rescaled was the most representative of overall model effect size, whereas EFR provided the most intuitive interpretation across all group size ratios. Empirically derived thresholds are provided.
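For readers unfamiliar with using entropy alongside classification rates, the sketch below illustrates the general idea in Python: case-level entropy of the predicted probabilities from a fitted logistic regression summarizes how many borderline (uncertain) cases the model produces, and that summary can be split by whether each case is classified correctly. This is a minimal, generic illustration only; the simulated data, the 0.5 cut-point, and the variable names are assumptions, and the paper's EFR measures and thresholds are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical two-group data, for illustration only
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]  # predicted probability of membership in group 1

# Shannon entropy of each case's predicted probability (base 2):
# 0 = the model is certain about the case, 1 = maximal uncertainty (p = 0.5)
eps = 1e-12
case_entropy = -(p * np.log2(p + eps) + (1 - p) * np.log2(1 - p + eps))

# Average entropy across cases: lower values indicate better separation
print("Mean case-level entropy:", case_entropy.mean())

# Because the true group is known, entropy can also be examined separately for
# correctly and incorrectly classified cases (here using a 0.5 cut-point).
# The article's EFR variants build on this idea; see the paper for the exact definitions.
correct = ((p >= 0.5).astype(int) == y)
print("Mean entropy, correctly classified cases:", case_entropy[correct].mean())
print("Mean entropy, misclassified cases:", case_entropy[~correct].mean())
```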

Keywords: Classification; cut-point methods; entropy; logistic regression; model-fit; misclassification
