Recently, many authors have proposed new algorithms that improve the accuracy of certain classifiers by combining a collection of individual classifiers obtained by resampling the training sample. Bagging and boosting are well-known methods in the machine learning context and have proved successful in classification problems. In the regression context, the application of these techniques has received little investigation. Our aim is to analyse, by simulation studies, when boosting and bagging can reduce the training set error and the generalization error, using nonparametric regression methods as predictors. In this work, we consider three methods: projection pursuit regression (PPR), multivariate adaptive regression splines (MARS), and local learning based on recursive covering (DART). (C) 2002 Elsevier Science B.V. All rights reserved.
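As a rough illustration of the two ensemble schemes named in the abstract, the sketch below shows bagging (bootstrap resampling followed by averaging) and a simple residual-fitting (L2) boosting loop for regression. The paper's base learners are PPR, MARS and DART; here a scikit-learn decision tree is used only as a readily available stand-in, and the boosting variant is a generic shrinkage scheme, not necessarily the one studied by the authors.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # stand-in for PPR / MARS / DART


def bagging_predict(X_train, y_train, X_test, n_estimators=50, random_state=0):
    """Bagging for regression: fit each base learner on a bootstrap resample
    of the training set and average the resulting predictions."""
    rng = np.random.default_rng(random_state)
    n = len(y_train)
    preds = np.zeros((n_estimators, len(X_test)))
    for b in range(n_estimators):
        idx = rng.integers(0, n, size=n)  # bootstrap sample (with replacement)
        model = DecisionTreeRegressor().fit(X_train[idx], y_train[idx])
        preds[b] = model.predict(X_test)
    return preds.mean(axis=0)  # aggregate by averaging


def l2_boosting_predict(X_train, y_train, X_test, n_estimators=50, learning_rate=0.1):
    """A simple L2-boosting scheme for regression: each base learner is fitted
    to the current residuals and added with a shrinkage factor."""
    fit_train = np.zeros(len(y_train))
    fit_test = np.zeros(len(X_test))
    for _ in range(n_estimators):
        residuals = y_train - fit_train
        model = DecisionTreeRegressor(max_depth=3).fit(X_train, residuals)
        fit_train += learning_rate * model.predict(X_train)
        fit_test += learning_rate * model.predict(X_test)
    return fit_test


if __name__ == "__main__":
    # Toy data: noisy sine curve
    rng = np.random.default_rng(1)
    X = rng.uniform(0, 6, size=(200, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)
    X_new = np.linspace(0, 6, 5).reshape(-1, 1)
    print("bagging: ", bagging_predict(X, y, X_new))
    print("boosting:", l2_boosting_predict(X, y, X_new))
```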
Authors: | Borra S., Di Ciaccio A. |
Title: | Improving nonparametric regression methods by bagging and boosting |
Conference name: | Meeting on Nonlinear Methods and Data Mining (NMDM2000) |
Conference location: | Rome, Italy |
Conference dates: | September 25-26, 2000 |
Relevance: | International relevance |
Section: | Invited |
Publication date: | 2002 |
Digital Object Identifier (DOI): | http://dx.doi.org/10.1016/S0167-9473(01)00068-8 |
Scientific Disciplinary Sector: | SECS-S/01 - Statistics |
Language: | English |
Type: | Conference paper |
Citation: | Borra S., Di Ciaccio A. (2002). Improving nonparametric regression methods by bagging and boosting. In Computational Statistics and Data Analysis (pp. 407-420). Amsterdam: Elsevier Science B.V. |
Appears in types: | 02 - Conference paper |