SECA: a stepwise algorithm for construction of neural networks ensembles
Ensembles of artificial neural networks (ANN) have been used in recent years as classification/regression machines, showing improved generalization capabilities that outperform those of single networks. However, it has been recognized that for aggregation to be effective the individual networks must be as accurate and diverse as possible. An important problem is, then, how to tune the aggregate members in order to achieve an optimal compromise between these two conflicting conditions. Recently, we proposed a new method for constructing ANN ensembles, termed here Stepwise Ensemble Construction Algorithm (SECA), which leads to overtrained aggregate members with an adequate balance between accuracy and diversity. We present here a more extensive evaluation of SECA and discuss a potential problem with this algorithm: the infrequent but damaging selection, through its heuristic, of particularly bad ensemble members. We introduce a modified version of SECA that copes with this problem by allowing individual weighting of aggregate members. The original algorithm and its weighted modification are favorably tested against other methods, producing an improvement in performance on the standard statistical databases used as benchmarks.
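The record does not include the paper's pseudocode, so the following is only a minimal sketch of a greedy stepwise ensemble builder in the spirit described by the abstract: candidate networks are trained one at a time and a candidate is kept only if it lowers the aggregate's validation error. The bootstrap resampling, the MSE criterion, the MLPRegressor stand-in for the ANNs, and the inverse-error weighting used for the "weighted" variant are all illustrative assumptions, not the authors' published formulation.

```python
# Illustrative sketch only (assumptions noted above); not the authors' SECA code.
import numpy as np
from sklearn.neural_network import MLPRegressor  # assumed stand-in for the ANN members


def stepwise_ensemble(X_train, y_train, X_val, y_val,
                      n_candidates=20, weighted=False, seed=0):
    """Greedily grow an ensemble, accepting a candidate only if it helps."""
    rng = np.random.RandomState(seed)
    members, preds = [], []          # accepted networks and their validation predictions
    best_err = np.inf

    for i in range(n_candidates):
        # Train a candidate network on a bootstrap resample of the training set.
        idx = rng.randint(0, len(X_train), len(X_train))
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                           random_state=seed + i).fit(X_train[idx], y_train[idx])
        cand_pred = net.predict(X_val)

        # Tentatively aggregate the candidate with the members accepted so far.
        trial = np.array(preds + [cand_pred])
        if weighted:
            # Hypothetical weighting: each member weighted by the inverse of its own
            # validation MSE, so an occasional bad member is down-weighted rather
            # than spoiling a plain average.
            errs = np.array([np.mean((p - y_val) ** 2) for p in trial])
            w = (1.0 / errs) / np.sum(1.0 / errs)
            agg = np.average(trial, axis=0, weights=w)
        else:
            agg = trial.mean(axis=0)

        err = np.mean((agg - y_val) ** 2)
        if err < best_err:           # keep the candidate only if the ensemble improves
            best_err = err
            members.append(net)
            preds.append(cand_pred)

    return members, best_err
```

The accept-only-if-it-helps loop is one simple way to trade accuracy against diversity; the weighted branch mirrors the abstract's idea of individually weighting aggregate members to limit the damage of an occasional poor selection.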
Saved in:
Main authors: Granitto, Pablo Miguel; Verdes, Pablo Fabián; Ceccatto, Hermenegildo Alejandro; Navone, Hugo Daniel
Format: Conference object
Language: English
Published: 2001
Subjects: Ciencias Informáticas; Neural nets; Algorithms; Artificial Intelligence; machine learning; ensemble methods
Online access: http://sedici.unlp.edu.ar/handle/10915/23398
Contributed by: SEDICI (UNLP), Universidad Nacional de La Plata
Record id: I19-R120-10915-23398