Statistics Seminar

Jelena Bradic, Department of Mathematics, University of California, San Diego
Minimax testing and adaptivity in high-dimensional models with non-sparse structures (high-dimensional learning without sparsity)

Wednesday, March 14, 2018 - 4:15pm
Biotech G01

Abstract: In this paper, we focus on hypothesis testing and confidence interval construction in high-dimensional linear models. We develop new concepts of uniform and essentially uniform non-testability that allow us to study the limitations of tests across a broad set of alternatives. Uniform non-testability identifies an extensive collection of alternatives such that the power of any test, against any alternative in this collection, is asymptotically at most equal to the nominal size. Whereas a minimax lower bound establishes the existence of one particularly "bad" alternative, we explicitly identify a large set of "bad" alternatives. Implications of the new constructions include new minimax testability results that stand in sharp contrast to existing results and do not depend on the sparsity of the model parameters. We identify new tradeoffs between testability and feature correlation. In particular, we show that in models with weak feature correlations, a minimax lower bound can be attained by a confidence interval whose width has the parametric rate, regardless of the model's sparsity. We also discover that whenever the feature correlation is known, inference at the parametric rate is achievable, irrespective of how large that correlation is or how sparse the model is.
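
To make the setting concrete, below is a minimal illustrative sketch (not the speaker's procedure) of a dense high-dimensional linear model y = X beta + eps with p > n and a known feature covariance Sigma. It uses the elementary moment identity E[x_i y_i] = Sigma * beta to form a plug-in estimate and a nominal 95% interval for a single coordinate without assuming sparsity; all names, dimensions, and numbers are assumptions chosen only for the demo.

```python
import numpy as np

# Illustrative sketch only: dense high-dimensional linear model with
# *known* feature covariance Sigma. The moment identity
#   E[x_i y_i] = Sigma @ beta
# suggests a simple plug-in estimate of beta_1 that does not rely on
# sparsity. This is a toy demonstration, not the method from the talk.

rng = np.random.default_rng(0)
n, p, rho = 200, 500, 0.3                     # more features than observations
idx = np.arange(p)
Sigma = rho ** np.abs(np.subtract.outer(idx, idx))   # AR(1)-style correlation
L = np.linalg.cholesky(Sigma)

beta = rng.normal(scale=1.0 / np.sqrt(p), size=p)    # dense (non-sparse) coefficients
X = rng.standard_normal((n, p)) @ L.T                # correlated features
y = X @ beta + rng.standard_normal(n)                # responses with noise

# Plug-in estimator of beta_1 using the known Sigma:
#   beta1_hat = e_1' Sigma^{-1} (1/n) sum_i x_i y_i
w = np.linalg.solve(Sigma, np.eye(p)[:, 0])          # Sigma^{-1} e_1
scores = (X @ w) * y                                 # per-observation terms
beta1_hat = scores.mean()
se = scores.std(ddof=1) / np.sqrt(n)                 # CLT-based standard error

ci = (beta1_hat - 1.96 * se, beta1_hat + 1.96 * se)  # nominal 95% interval
print(f"true beta_1 = {beta[0]:.3f}, estimate = {beta1_hat:.3f}, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

The interval above shrinks at the usual 1/sqrt(n) rate because the known Sigma removes any need to estimate the feature correlation; this is offered only as intuition for why known correlation can make parametric-rate inference plausible without sparsity.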