In this paper, we are interested in the problem of smoothing parameter selection in nonparametric curve estimation under dependent errors. We focus on kernel estimation and the case when the errors form a general stationary sequence of martingale difference random variables where neither linearity assumption nor "all moments are finite" are required. We compare the behaviors of the smoothing bandwidths obtained by minimizing either the unknown average squared error, the theoretical mean average squared error, a Mallows-type criterion adapted to the dependent case and the family of criteria known as generalized cross validation (GCV) extensions of the Mallows' criterion. We prove that these three minimizers and those based on the GCV family are first-order equivalent in probability. We give also a normal asymptotic behavior of the gap between the minimizer of the average squared error and that of the Mallows-type criterion. This is extended to the GCV family. Finally, we apply our theoretical results to a specific case of martingale difference sequence, namely the Auto-Regressive Conditional Heteroscedastic (ARCH(1)) process. A Monte-Carlo simulation study, for this regression model with ARCH(1) process, is conducted.
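As a rough illustration of the kind of criterion the abstract compares, the sketch below selects a kernel bandwidth by minimizing a GCV score for a Nadaraya-Watson smoother on data whose errors follow an ARCH(1) recursion. The estimator, kernel, grid, and ARCH parameters are illustrative choices, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def arch1_errors(n, a0=0.5, a1=0.3):
    """Simulate ARCH(1) errors: e_t = sigma_t * z_t, sigma_t^2 = a0 + a1 * e_{t-1}^2."""
    e = np.zeros(n)
    for t in range(1, n):
        sigma2 = a0 + a1 * e[t - 1] ** 2
        e[t] = np.sqrt(sigma2) * rng.standard_normal()
    return e

def nw_hat_matrix(x, h):
    """Hat matrix of the Nadaraya-Watson estimator with a Gaussian kernel."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return K / K.sum(axis=1, keepdims=True)

def gcv_score(y, x, h):
    """Generalized cross-validation score for bandwidth h."""
    L = nw_hat_matrix(x, h)
    resid = y - L @ y
    n = len(y)
    return n * np.sum(resid ** 2) / (n - np.trace(L)) ** 2

# Toy regression with ARCH(1) errors.
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + arch1_errors(n)

grid = np.linspace(0.01, 0.3, 30)
scores = [gcv_score(y, x, h) for h in grid]
h_gcv = grid[int(np.argmin(scores))]
print(f"GCV-selected bandwidth: {h_gcv:.3f}")
```

The Mallows-type criterion of the paper additionally requires an estimate of the error variance adapted to the dependence structure; GCV avoids that plug-in, which is one reason the abstract treats the GCV family separately.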
The prediction error (average squared error) is the most commonly used performance criterion for the assessment of nonparametric regression estimators. However, there has been little investigation of the properties of the criterion itself. This paper shows that in certain situations the prediction error can be very misleading because it fails to discriminate an extreme undersmoothed estimate from a good estimate. For spline smoothing, we show, using asymptotic analysis and simulations, that there is poor discrimination of extreme undersmoothing in the following situations: small sample size or small error variance or a function with high curvature. To overcome this problem, we propose using the Sobolev error criterion. For spline smoothing, it is shown asymptotically and by simulations that the Sobolev error is significantly better than the prediction error in discriminating extreme undersmoothing. Similar results hold for other nonparametric regression estimators and for multivariate smoothing. For thin-plate smoothing splines, the prediction error's poor discrimination of extreme undersmoothing becomes significantly worse with increasing dimension. (C) 2014 Elsevier B.V. All rights reserved.
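The failure mode described above can be seen numerically. The sketch below compares the average squared error (ASE) with a first-order Sobolev error (ASE plus the squared error of the first derivative, approximated by finite differences) for an extremely undersmoothed fit versus a sensible one. It uses a Nadaraya-Watson smoother rather than the paper's smoothing splines, so it is only a qualitative illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def nw_fit(x, y, h):
    """Nadaraya-Watson fit at the design points (Gaussian kernel)."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (K / K.sum(axis=1, keepdims=True)) @ y

def prediction_error(m, mhat):
    """Average squared error: mean of (mhat - m)^2 over the design points."""
    return np.mean((mhat - m) ** 2)

def sobolev_error(x, m, mhat):
    """First-order Sobolev error: ASE plus the ASE of the first derivative,
    approximated here by finite differences (an illustrative choice)."""
    d = np.gradient(mhat - m, x)
    return prediction_error(m, mhat) + np.mean(d ** 2)

n = 100
x = np.linspace(0, 1, n)
m = np.sin(2 * np.pi * x)
y = m + 0.1 * rng.standard_normal(n)

for h in (0.003, 0.05):   # extreme undersmoothing vs a sensible bandwidth
    mhat = nw_fit(x, y, h)
    print(f"h={h}: ASE={prediction_error(m, mhat):.4f}, "
          f"Sobolev={sobolev_error(x, m, mhat):.4f}")
```

At the tiny bandwidth the fit nearly interpolates the noise, so its ASE stays close to the noise variance while the derivative term of the Sobolev error blows up, which is exactly the discrimination the abstract argues for.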
The one-sided cross-validation (OSCV) method is shown to be robust to lack of smoothness in the regression function. Two corrections for the case where the regression function has a discontinuous first derivative are proposed. Simulation results suggest that proposed modifications of the OSCV method are efficient for regression functions whose first derivative is discontinuous at more than two points. The OSCV method and its modification outperform the cross-validation method and the Ruppert-Sheather-Wand plug-in method in a data example involving a function that, potentially, has one discontinuity in its derivative.
The effects of moderate levels of serial correlation on one-sided and ordinary cross-validation in the context of local linear and kernel smoothing is investigated. It is shown both theoretically and by simulation that one-sided cross-validation is much less adversely affected by correlation than is ordinary cross-validation. The former method is a reliable means of window width selection in the presence of moderate levels of serial correlation, while the latter is not. It is also shown that ordinary cross-validation is less robust to correlation when applied to Gasser-Muller kernel estimators than to local linear ones. (C) 2003 Elsevier Inc. All rights reserved.
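The vulnerability of ordinary cross-validation to positive serial correlation can be demonstrated with a leave-one-out CV bandwidth search under AR(1) errors, as sketched below. The AR(1) coefficient, noise level, and grid are illustrative assumptions; with positively correlated errors, CV tends to chase the correlated noise and pick a bandwidth near the bottom of the grid.

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1_errors(n, rho=0.5, sd=0.3):
    """AR(1) errors: e_t = rho * e_{t-1} + Gaussian innovation."""
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = rho * e[t - 1] + sd * rng.standard_normal()
    return e

def loo_cv(x, y, h):
    """Leave-one-out CV score for a Nadaraya-Watson fit (Gaussian kernel)."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    np.fill_diagonal(K, 0.0)            # leave the i-th point out
    mhat = (K @ y) / K.sum(axis=1)
    return np.mean((y - mhat) ** 2)

n = 150
x = np.linspace(0, 1, n)
m = np.sin(2 * np.pi * x)

grid = np.linspace(0.005, 0.2, 40)
for rho in (0.0, 0.5):
    y = m + ar1_errors(n, rho=rho)
    h_cv = grid[int(np.argmin([loo_cv(x, y, h) for h in grid]))]
    print(f"rho={rho}: CV bandwidth {h_cv:.3f}")
```

One-sided CV mitigates this because the prediction point is separated from the training points, weakening the influence of short-range correlation on the criterion.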
A new method of selecting the smoothing parameters of nonparametric regression estimators is introduced. The method, termed one-sided cross-validation (OSCV), has the objectivity of cross-validation and statistical properties comparable to those of a plug-in rule. The new method may be viewed as an application of the prequential model selection method of Dawid. As such, our results identify a situation in which the prequential method is a more efficient model selector than cross-validation. An example, simulations, and theoretical results demonstrate the utility of OSCV when used with local linear and kernel estimators.
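The one-sided criterion can be sketched as follows: each response is predicted from the data strictly to its left, and the bandwidth minimizing the resulting prediction error is then rescaled. The code uses a one-sided Nadaraya-Watson smoother for brevity (the paper works with local linear and kernel estimators), and it does not fix a value for the rescaling constant, which is kernel- and estimator-dependent.

```python
import numpy as np

rng = np.random.default_rng(3)

def oscv_score(x, y, h, min_left=5):
    """One-sided CV: predict each response from data strictly to its left,
    here with a one-sided Nadaraya-Watson smoother (Gaussian kernel)."""
    n = len(y)
    err = []
    for i in range(min_left, n):
        xl, yl = x[:i], y[:i]
        w = np.exp(-0.5 * ((x[i] - xl) / h) ** 2)
        err.append((y[i] - w @ yl / w.sum()) ** 2)
    return np.mean(err)

n = 120
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)

grid = np.linspace(0.02, 0.3, 30)
h_oscv = grid[int(np.argmin([oscv_score(x, y, h) for h in grid]))]
# The final bandwidth is C * h_oscv, where C is a kernel- and
# estimator-dependent rescaling constant derived in the paper.
print(f"one-sided CV minimizer: {h_oscv:.3f}")
```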
Given the model Y_i = m(X_i) + e_i, where E(e_i) = 0, X_i ∈ C, i = 1, ..., n, and C is a p-dimensional compact set, we have designed a new method for testing the hypothesis that the regression function follows a g...
The average squared error has been suggested earlier as an appropriate estimate of the integrated squared error, but an example is given which shows their ratio can tend to infinity. The results of a Monte Carlo study...