Improved regularization in extreme learning machines

Title: Improved regularization in extreme learning machines

Authors: Kulaif, Andrea Carolina Peres; Von Zuben, Fernando J.

Abstract: Extreme learning machines (ELMs) are single-hidden-layer feedforward neural networks designed to be trained at a low computational cost and to exhibit direct control of the generalization capability, for both regression and classification tasks. By means of ridge regression, it is possible to properly control the norm of the connection weights at the output layer. Restricted to regression tasks and MLP-like topologies, the main contribution of this paper is to show that a more refined search for the ridge regression parameter, based on a golden section mechanism, consistently leads to better generalization performance than the state-of-the-art ELM proposals in the literature, regardless of the number of neurons at the hidden layer. The improved regularization comes at an additional computational cost; however, since we are still adjusting only the connection weights at the output layer, the effective cost remains orders of magnitude below that of training an MLP neural network.
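The abstract combines two standard ingredients: ridge regression for the ELM output weights and a golden section search over the regularization parameter. A minimal sketch of that combination is given below; it is an illustration under generic assumptions (tanh hidden units, a held-out validation set to score each candidate parameter, a search over the exponent of the ridge parameter), not the paper's exact procedure.

```python
import numpy as np

def train_elm(X, y, n_hidden, lam, rng):
    # Hypothetical minimal ELM: random (fixed) input weights, tanh hidden layer,
    # output weights fitted by ridge regression: beta = (H'H + lam*I)^-1 H'y.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def val_error(lam, Xtr, ytr, Xva, yva, n_hidden, seed=0):
    # Mean squared error on a validation set for a given ridge parameter.
    rng = np.random.default_rng(seed)
    W, b, beta = train_elm(Xtr, ytr, n_hidden, lam, rng)
    pred = np.tanh(Xva @ W + b) @ beta
    return np.mean((pred - yva) ** 2)

def golden_section(f, a, b, tol=1e-4):
    # Golden-section search for the minimizer of a unimodal f on [a, b].
    phi = (np.sqrt(5) - 1) / 2
    c, d = b - phi * (b - a), a + phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2
```

Searching over the exponent (i.e., log10 of the ridge parameter) keeps the interval well-scaled; only the output-layer solve is repeated at each probe, so the low training cost of the ELM is preserved.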

Keywords: Extreme learning machines; golden section search; regression problems; generalization capability

Pages: 6

DOI: 10.21528/CBIC2013-139

Article PDF: bricsccicbic2013_submission_139.pdf

BibTeX file: bricsccicbic2013_submission_139.bib