Beran (2014) introduced hypercube fits to the univariate linear model as a completion of the class of Penalized Least Squares (PLS) fits with quadratic penalties. Hypercube fits include submodel Least Squares fits that are limits of PLS fits as penalty weights tend to infinity. Through control of condition number, they improve the numerical stability of PLS fits. Adaptive hypercube fits that minimize estimated risk can behave asymptotically, as the number of regressors increases, like their oracle counterparts. In particular, suitable adaptive hypercube fits extend to general regression designs the asymptotic risk reduction achieved by multiple Stein shrinkage in balanced orthogonal designs.
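The limit described above can be sketched numerically. The snippet below is a minimal illustration, not Beran's implementation: a PLS fit with a quadratic penalty that weights only one coefficient converges, as the penalty weight tends to infinity, to the Least Squares fit of the submodel omitting that regressor. All variable names and the simulated design are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

def pls_fit(X, y, W, lam):
    # Penalized Least Squares: minimize ||y - X b||^2 + lam * b' W b
    return np.linalg.solve(X.T @ X + lam * W, X.T @ y)

# Quadratic penalty acting on the third coefficient only (illustrative choice)
W = np.diag([0.0, 0.0, 1.0])

b_huge = pls_fit(X, y, W, 1e10)

# Submodel Least Squares fit: drop the penalized regressor entirely
b_sub, *_ = np.linalg.lstsq(X[:, :2], y, rcond=None)

# As the penalty weight grows, the PLS fit approaches the submodel LS fit
print(np.allclose(b_huge[:2], b_sub, atol=1e-4), abs(b_huge[2]) < 1e-4)
```

Hypercube fits complete this family by including such limits directly, avoiding the ill-conditioned linear systems that arise when the penalty weight is merely very large.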
This talk describes adaptive hypercube fits to the multivariate linear model. A new feature in the multivariate case is the replacement of scalar penalty weights with matrix penalty weights. Suitable adaptive hypercube fits extend to the multivariate linear model the asymptotic risk reduction achieved by multiple Efron-Morris affine shrinkage in balanced orthogonal designs. Reduced-risk fits to unbalanced MANOVA designs illustrate the methodology.
Reference: Beran, R. (2014). Hypercube estimators: Penalized least squares, submodel selection, and numerical stability. Computational Statistics and Data Analysis 71, 654-666.
Hypercube Fits to the Multivariate Linear Model
17.10.2016 16:46 - 17:45
Location: Sky Lounge OMP1