Chen-Pin Wang, University of Texas Health Science Center at San Antonio

Goutis-Robert Kullback-Leibler Divergence in Generalized Linear Models and Beyond: Applications to Studies in Type 2 Diabetes
This talk considers the Kullback-Leibler divergence (KLD) of Goutis and Robert (1998),
originally proposed for comparing nested generalized linear models.
We derive the asymptotic properties of this KLD under certain regularity
conditions, where neither of the models under comparison is required to be the true model.
We also examine the behavior of these asymptotic properties when the regularity
conditions are not fully satisfied. Furthermore, we establish the connection
between Goutis and Robert's KLD and a weighted posterior predictive p-value (WPPP).
Finally, we apply both the KLD and the WPPP to compare models in simulation studies as well
as in two type 2 diabetes studies, in which only some of the regularity conditions were met.
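The central idea of a KLD-based comparison of nested models can be illustrated with a minimal sketch. The example below is a simplified illustration, not the Goutis-Robert projection itself: it fits a full and a nested Gaussian linear model by least squares and averages the per-observation Gaussian KL divergence between their fitted predictive means (all variable names and the common plug-in variance are assumptions made for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# Full model: intercept + x1 + x2; nested model drops x2.
X_full = np.column_stack([np.ones(n), x1, x2])
X_sub = np.column_stack([np.ones(n), x1])

beta_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)
beta_sub, *_ = np.linalg.lstsq(X_sub, y, rcond=None)

mu_full = X_full @ beta_full
mu_sub = X_sub @ beta_sub
sigma2 = np.mean((y - mu_full) ** 2)  # plug-in error variance

# Per-observation KL between Gaussians with common variance:
# KL(N(mu_f, s2) || N(mu_s, s2)) = (mu_f - mu_s)^2 / (2 * s2),
# averaged over the observed design points.
kld = np.mean((mu_full - mu_sub) ** 2) / (2.0 * sigma2)
print(f"average KLD (full vs nested): {kld:.4f}")
```

A large average KLD indicates that dropping the covariate materially changes the fitted predictive distribution, which is the intuition behind using a projection-style KLD as a model-comparison criterion.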