American Statistical Association
Scaled sparse linear regression jointly estimates the regression coefficients and noise level in a linear model. It chooses an equilibrium with a sparse regression method by iteratively estimating the noise level via the mean residual squares and scaling the penalty in proportion to the estimated noise level. The iterative algorithm costs nearly nothing beyond the computation of a path of the sparse regression estimator for penalty levels above a threshold. For the scaled Lasso, the algorithm is a gradient descent in a convex minimization of a penalized joint loss function for the regression coefficients and noise level. Under mild regularity conditions, we prove that the method yields simultaneously an estimator for the noise level and an estimated coefficient vector in the Lasso path satisfying certain oracle inequalities for the estimation of the noise level, prediction, and the estimation of regression coefficients. These oracle inequalities provide sufficient conditions for the consistency and asymptotic normality of the estimator for the noise level, including cases where the number of variables is of greater order than the sample size. Numerical results demonstrate the superior performance of the proposed method over an earlier proposal of joint convex minimization.
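The alternating scheme described in the abstract — estimate the noise level from the mean residual squares, then refit a sparse regression with the penalty scaled by that estimate — can be sketched in a few lines. The following is a minimal illustration, not the speaker's implementation: the coordinate-descent Lasso solver, the universal penalty level `lam0 = sqrt(2 log(p)/n)`, and the initial noise estimate are all assumed choices made for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator used in the Lasso coordinate update.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, beta=None, n_sweeps=200, tol=1e-8):
    # Coordinate descent for (1/(2n))||y - X b||^2 + lam ||b||_1,
    # assuming columns of X are standardized so that ||x_j||^2 / n = 1.
    n, p = X.shape
    beta = np.zeros(p) if beta is None else beta.copy()
    r = y - X @ beta
    for _ in range(n_sweeps):
        max_delta = 0.0
        for j in range(p):
            old = beta[j]
            z = X[:, j] @ r / n + old          # partial residual correlation
            beta[j] = soft_threshold(z, lam)
            delta = beta[j] - old
            if delta != 0.0:
                r -= X[:, j] * delta           # keep residual in sync
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:
            break
    return beta

def scaled_lasso(X, y, lam0=None, n_iter=50, tol=1e-6):
    # Alternate between (i) estimating sigma via the mean residual squares
    # and (ii) a Lasso fit with penalty sigma * lam0, until the noise
    # estimate stabilizes.  lam0 here is the "universal" level (assumption).
    n, p = X.shape
    if lam0 is None:
        lam0 = np.sqrt(2.0 * np.log(p) / n)
    sigma = np.sqrt(np.mean(y ** 2))           # crude initial noise estimate
    beta = np.zeros(p)
    for _ in range(n_iter):
        beta = lasso_cd(X, y, sigma * lam0, beta)
        new_sigma = np.sqrt(np.mean((y - X @ beta) ** 2))
        if abs(new_sigma - sigma) <= tol * sigma:
            sigma = new_sigma
            break
        sigma = new_sigma
    return beta, sigma
```

On simulated data with a few strong signals and unit noise, the returned `sigma` tracks the true noise level while `beta` stays sparse, illustrating the joint estimation the abstract describes.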
Dr. Cun-Hui Zhang earned his Ph.D. from the Department of Statistics at Columbia University. His research interests include empirical Bayes methods, survival analysis, statistical inference, and probability theory. He is a Fellow of the Institute of Mathematical Statistics (IMS).
Date: Thursday, May 5, 2011
Time: 4:00–5:00 P.M.
Mailman School of Public Health
Department of Biostatistics
722 West 168th Street
Biostatistics Computer Lab
6th Floor - Room 656
New York, New York