Parametric complexity is a central concept in Minimum Description Length (MDL) model selection. In practice it often turns out to be infinite, even for quite simple models such as the Poisson and geometric families. In such cases, MDL model selection based on NML and Bayesian inference based on Jeffreys' prior cannot be used. Several ways to resolve this problem have been proposed. We conduct experiments to compare and evaluate their behaviour on small sample sizes. We find surprisingly poor behaviour for the plug-in predictive code; a restricted NML model performs quite well, but it is questionable whether the results validate its theoretical motivation. A Bayesian marginal distribution with Jeffreys' prior can still be used if one sacrifices the first observation to obtain a proper posterior; this approach turns out to be the most dependable. (c) 2005 Elsevier Inc. All rights reserved.
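The "sacrifice the first observation" idea can be made concrete. A minimal sketch, using the Poisson family as the illustration (the family choice and the function name are ours, not taken from the paper): with Jeffreys' prior pi(lambda) proportional to lambda^(-1/2), which is improper, conditioning on the first observation x_1 yields a proper Gamma(x_1 + 1/2, 1) posterior, and the marginal code length for the remaining data has a closed form via gamma functions.

```python
import math

def poisson_jeffreys_codelen(xs):
    """Code length in bits of x_2..x_n given x_1, for the Poisson family
    under Jeffreys' prior (illustrative sketch, not the paper's code).

    Jeffreys' prior pi(lambda) ~ lambda^(-1/2) is improper; after seeing
    x_1 the posterior is a proper Gamma(x_1 + 1/2, rate=1).  Marginalising
    the remaining likelihood against that posterior gives
        P(x_2..x_n | x_1)
          = Gamma(S + 1/2) / (Gamma(x_1 + 1/2) * prod_{i>=2} x_i! * n^(S + 1/2)),
    where S = sum of all n observations.  We return -log2 of this.
    """
    n = len(xs)
    S = sum(xs)
    log_p = (math.lgamma(S + 0.5)
             - math.lgamma(xs[0] + 0.5)
             - (S + 0.5) * math.log(n)
             - sum(math.lgamma(x + 1) for x in xs[1:]))  # log of x_i! terms
    return -log_p / math.log(2)  # nats -> bits
```

As a sanity check, with two observations the predictive distribution of x_2 given x_1 is negative binomial, so summing 2^(-codelength) over x_2 recovers (numerically) a total probability of 1; this is what makes the marginal usable as a code once the first observation has been paid for separately.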