One of the most prominent methods in high-dimensional statistics is the lasso. Much of the theory for the lasso in the high-dimensional linear model hinges on the so-called effective noise. Among other things, the effective noise plays an important role in finite-sample bounds for the lasso, the calibration of the lasso's tuning parameter, and inference on the unknown parameter vector. In the talk, we develop a bootstrap-based estimator of the quantiles of the effective noise. Based on this estimator, we derive novel methods for tuning parameter calibration and inference for the lasso.
Underlying paper: https://www.jmlr.org/papers/v22/20-539.html
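To give a rough sense of the idea ahead of the talk, here is a minimal Python sketch of estimating a quantile of the effective noise 2‖Xᵀu‖∞/n by a multiplier bootstrap, with pilot lasso residuals standing in for the unobserved noise u. The function name, the use of scikit-learn's LassoCV for the pilot fit, and the Gaussian-multiplier scheme are illustrative assumptions, not the exact procedure developed in the paper; scaling conventions for the lasso objective also differ across solvers.

```python
import numpy as np
from sklearn.linear_model import LassoCV

def effective_noise_quantile(X, residuals, alpha=0.05, n_boot=2000, rng=None):
    """Multiplier-bootstrap estimate of the (1 - alpha)-quantile of the
    effective noise 2 * ||X'u||_inf / n, with pilot residuals standing in
    for the unobserved noise u (illustrative sketch only)."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    # Gaussian multipliers, one row per bootstrap draw
    G = rng.standard_normal((n_boot, n))
    # Bootstrap analogue of 2 * ||X'(residuals * g)||_inf / n for each draw
    stats = 2.0 * np.max(np.abs((residuals * G) @ X), axis=1) / n
    return np.quantile(stats, 1.0 - alpha)

# Toy usage: calibrate the lasso's tuning parameter at the estimated quantile.
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + rng.standard_normal(n)

pilot = LassoCV(cv=5, max_iter=5000).fit(X, y)   # pilot fit to obtain residuals
resid = y - pilot.predict(X)
lam = effective_noise_quantile(X, resid, alpha=0.05)
print(f"calibrated tuning parameter (alpha = 0.05): {lam:.4f}")
```

The estimated quantile would then serve directly as the lasso's tuning parameter (up to the scaling convention of the chosen solver); the talk and the paper develop the estimator and its guarantees in detail.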
Speaker: Michael Vogt (personal website)
The talk can also be joined online via Zoom.
Meeting room opens: May 16, 2022, 4:30 pm (Vienna time)
Meeting ID: 655 6399 0669
Password: 422158