Klock, T., Lanteri, A., Vigogna, S. (2021). Estimating multi-index models with response-conditional least squares. Electronic Journal of Statistics, 15(1), 589-629. doi: 10.1214/20-EJS1785.
Estimating multi-index models with response-conditional least squares
Vigogna S.
2021-01-01
Abstract
The multi-index model is a simple yet powerful high-dimensional regression model which circumvents the curse of dimensionality by assuming E[Y|X] = g(A^T X) for some unknown index space A and link function g. In this paper we introduce a method for the estimation of the index space, and study the propagation error of an index space estimate in the regression of the link function. The proposed method approximates the index space by the span of linear regression slope coefficients computed over level sets of the data. Being based on ordinary least squares, our approach is easy to implement and computationally efficient. We prove a tight concentration bound that shows N^(-1/2)-convergence, but also faithfully describes the dependence on the chosen partition of level sets, hence providing guidance on hyperparameter tuning. The estimator's competitiveness is confirmed by extensive comparisons with state-of-the-art methods, both on synthetic and real data sets. As a second contribution, we establish minimax optimal generalization bounds for k-nearest neighbors and piecewise polynomial regression when trained on samples projected onto any N^(-1/2)-consistent estimate of the index space, thus providing complete and provable estimation of the multi-index model.
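The core estimator described in the abstract — fit an ordinary least squares slope on each level set of the response and take the span of the resulting slope vectors — can be sketched as follows. This is a minimal illustrative implementation assuming quantile-based level sets and an SVD to extract an orthonormal basis of the slope span; the function name `rcls_index_space` and all parameter choices are hypothetical, not taken from the paper.

```python
import numpy as np

def rcls_index_space(X, y, n_slices=10, dim=1):
    """Sketch of response-conditional least squares (RCLS).

    Partitions the sample into level sets of the response, fits an OLS
    slope vector on each slice, and returns an orthonormal basis (D x dim)
    for the span of the slopes via a singular value decomposition.
    """
    N, D = X.shape
    # Level sets: slices of the response range with roughly equal counts.
    edges = np.quantile(y, np.linspace(0.0, 1.0, n_slices + 1))
    slopes = []
    for j in range(n_slices):
        if j < n_slices - 1:
            mask = (y >= edges[j]) & (y < edges[j + 1])
        else:
            mask = y >= edges[j]  # last slice is closed on the right
        Xs, ys = X[mask], y[mask]
        if Xs.shape[0] <= D:
            continue  # too few points for a stable OLS fit on this slice
        # OLS with intercept; keep only the slope part of the coefficients.
        A = np.column_stack([np.ones(Xs.shape[0]), Xs])
        coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
        slopes.append(coef[1:])
    B = np.asarray(slopes)  # one slope vector per usable slice
    # Orthonormal basis of the slope span: leading right-singular vectors.
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    return Vt[:dim].T
```

On a single-index example with a monotone link, e.g. y = (a^T x)^3, the estimated direction should align closely with a; the number of slices plays the role of the partition hyperparameter whose effect the paper's concentration bound quantifies.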