Abstract
In the context of a partially linear regression model, Stein-rule shrinkage semiparametric estimation is considered. In this framework, the coefficient vector is partitioned into two sub-vectors: the first sub-vector contains the coefficients of interest, i.e., the main effects (for example, treatment effects), and the second sub-vector contains coefficients for variables that may or may not need to be controlled. When estimating the first sub-vector, the best estimate may be obtained using either the full model, which includes both sub-vectors, or the reduced model, which leaves out the second sub-vector. It is demonstrated that shrinkage estimators which combine the two semiparametric estimators computed for the full model and the reduced model outperform the semiparametric estimator for the full model. Using the semiparametric estimator for the reduced model is best when the second sub-vector is the null vector, but this estimator suffers seriously from bias otherwise. The relative dominance picture of the suggested estimators is investigated. In particular, the suitability of estimating the nonparametric component with a B-spline basis is explored. Further, the performance of the proposed estimators is compared with absolute penalty estimators through Monte Carlo simulation; the lasso and adaptive lasso are implemented for simultaneous model selection and parameter estimation. A real data example compares the proposed estimators with the lasso and adaptive lasso estimators.
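To make the shrinkage idea concrete, below is a minimal sketch (not the paper's implementation) of a Stein-type shrinkage estimator in a partially linear model y = X1·β1 + X2·β2 + g(t) + ε: the nonparametric component g(t) is approximated by a B-spline basis (here via scikit-learn's SplineTransformer, an assumed convenience), the full-model and reduced-model estimates of the first sub-vector are obtained by least squares, and the two are combined through a Wald-type statistic for the second sub-vector. The function name, knot choices, and variance estimate are illustrative assumptions.

```python
import numpy as np
from sklearn.preprocessing import SplineTransformer  # assumed B-spline basis builder


def shrinkage_plm(y, X1, X2, t, n_knots=8, degree=3):
    """Stein-type shrinkage estimate of beta1 in the partially linear model
    y = X1 @ beta1 + X2 @ beta2 + g(t) + error, combining the full-model and
    reduced-model (beta2 = 0) least-squares estimates. Illustrative sketch only."""
    n, p2 = X2.shape
    p1 = X1.shape[1]

    # Approximate g(.) with a cubic B-spline basis evaluated at t.
    B = SplineTransformer(n_knots=n_knots, degree=degree).fit_transform(t.reshape(-1, 1))

    def ols(Z):
        coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
        return coef

    Z_full = np.hstack([X1, X2, B])   # full model: both sub-vectors plus spline terms
    Z_red = np.hstack([X1, B])        # reduced model: second sub-vector omitted
    b_full = ols(Z_full)
    beta1_full, beta2_full = b_full[:p1], b_full[p1:p1 + p2]
    beta1_red = ols(Z_red)[:p1]

    # Wald-type statistic T_n for H0: beta2 = 0, based on the full-model fit.
    resid = y - Z_full @ b_full
    sigma2 = resid @ resid / (n - Z_full.shape[1])
    cov_b2 = sigma2 * np.linalg.pinv(Z_full.T @ Z_full)[p1:p1 + p2, p1:p1 + p2]
    T_n = beta2_full @ np.linalg.solve(cov_b2, beta2_full)

    # Stein-rule and positive-part shrinkage of the full estimate toward the reduced one
    # (requires p2 > 2 for the usual James-Stein factor).
    shrink = 1.0 - (p2 - 2) / T_n
    beta1_s = beta1_red + shrink * (beta1_full - beta1_red)
    beta1_sp = beta1_red + max(shrink, 0.0) * (beta1_full - beta1_red)
    return beta1_s, beta1_sp
```

The positive-part version guards against over-shrinkage when T_n is small relative to p2 − 2, which is the situation where the ordinary Stein-rule factor would become negative.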
Keywords
Partially linear model; James–Stein estimator; Absolute penalty estimation; Lasso; Adaptive lasso; B-spline approximation; Semiparametric model; Monte Carlo simulation
How to cite
Raheem, S. E., Ahmed, S. E., and Doksum, K. A. (2012). Absolute penalty and shrinkage estimation in partially linear models. Computational Statistics and Data Analysis, 56(4), 874–891.
