Pearlmutter, Barak A. and Rosenfeld, Ronald (1991) Chaitin-Kolmogorov Complexity and Generalization in Neural Networks. Advances in Neural Information Processing Systems. pp. 925-931. ISSN 1049-5258
Abstract
We present a unified framework for a number of different ways of failing
to generalize properly. During learning, sources of random information
contaminate the network, effectively augmenting the training data with
random information. The complexity of the function computed is therefore
increased, and generalization is degraded. We analyze replicated networks,
in which a number of identical networks are independently trained on the
same data and their results averaged. We conclude that replication almost
always results in a decrease in the expected complexity of the network, and
that replication therefore increases expected generalization. Simulations
confirming the effect are also presented.
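The replication scheme the abstract describes — training several identical networks independently on the same data and averaging their outputs — can be sketched minimally as follows. This is an illustrative reconstruction, not the paper's code: the toy linear "network", the training routine, and all names here are hypothetical, with the replicas differing only in their random initialization.

```python
import random

def train_linear_net(data, seed, epochs=200, lr=0.1):
    """Train a one-unit net y = w*x + b by per-example gradient
    descent, starting from a seed-dependent random initialization.
    (Hypothetical stand-in for one replica network.)"""
    rng = random.Random(seed)
    w, b = rng.uniform(-1, 1), rng.uniform(-1, 1)
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y   # prediction error on this example
            w -= lr * err * x       # gradient step on the weight
            b -= lr * err           # gradient step on the bias
    return w, b

def replicated_predict(nets, x):
    """Average the outputs of the independently trained replicas."""
    return sum(w * x + b for w, b in nets) / len(nets)

# Same training data for every replica; only the random seed differs.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # generated by y = 2x + 1
nets = [train_linear_net(data, seed) for seed in range(5)]
print(replicated_predict(nets, 3.0))
```

Averaging smooths out the initialization-dependent idiosyncrasies of the individual replicas, which is the intuition behind the paper's claim that replication decreases the expected complexity of the computed function.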
| Field | Value |
|---|---|
| Item Type | Article |
| Keywords | Chaitin-Kolmogorov Complexity; Neural Networks |
| Academic Unit | Faculty of Science and Engineering > Computer Science; Faculty of Science and Engineering > Research Institutes > Hamilton Institute |
| Item ID | 5536 |
| Depositing User | Barak Pearlmutter |
| Date Deposited | 04 Nov 2014 10:23 |
| Journal or Publication Title | Advances in Neural Information Processing Systems |
| Publisher | Massachusetts Institute of Technology Press (MIT Press) |
| Refereed | Yes |
| URI | https://mu.eprints-hosting.org/id/eprint/5536 |
| Use Licence | Creative Commons Attribution Non Commercial Share Alike Licence (CC BY-NC-SA) |