Hebbian feature discovery improves classifier efficiency

Todd Leen, Mike Rudnick, Dan Hammerstrom

Research output: Contribution to conference › Paper


Abstract

Two neural network implementations of principal component analysis (PCA) are used to reduce the dimension of speech signals. The compressed signals are then used to train a feedforward classification network for vowel recognition. A comparison is made of classification performance, network size, and training time for networks trained with both compressed and uncompressed data. Results show that a significant reduction in training time, fivefold in the present case, can be achieved without a sacrifice in classifier accuracy. This reduction includes the time required to train the compression network. Thus, dimension reduction, as performed by unsupervised neural networks, is a viable tool for enhancing the efficiency of neural classifiers.
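The abstract does not specify which two Hebbian PCA implementations were compared. As an illustration of the kind of unsupervised compression described, below is a minimal sketch of one standard Hebbian PCA rule, Sanger's generalized Hebbian algorithm (GHA); the function names, learning rate, and toy data are assumptions for demonstration, not the authors' setup:

```python
import numpy as np

def gha_step(W, x, lr=0.01):
    """One update of Sanger's generalized Hebbian algorithm (GHA).

    W : (k, d) weight matrix; its rows converge to the top-k principal
        components of the (zero-mean) input distribution.
    x : (d,) input sample.
    """
    y = W @ x  # component activations
    # Hebbian outer-product term minus a Gram-Schmidt-like
    # decorrelation term (lower-triangular part of y y^T)
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Toy demo: recover the leading principal direction of 2-D data
# whose variance is much larger along the first axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 2)) * np.array([3.0, 0.5])
X -= X.mean(axis=0)

W = rng.normal(scale=0.1, size=(1, 2))  # one unit -> top component
for x in X:
    W = gha_step(W, x, lr=0.001)

w = W[0] / np.linalg.norm(W[0])  # should align with the first axis
```

Compressing inputs through such a network (y = W x) before classifier training is what yields the reported reduction in classifier size and training time.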

Original language: English (US)
Pages: 51-56
Number of pages: 6
State: Published - Dec 1 1990
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90 - San Diego, CA, USA
Duration: Jun 17 1990 - Jun 21 1990


ASJC Scopus subject areas

  • Engineering (all)


  • Cite this

    Leen, T., Rudnick, M., & Hammerstrom, D. (1990). Hebbian feature discovery improves classifier efficiency. Paper presented at the 1990 International Joint Conference on Neural Networks (IJCNN 90), San Diego, CA, USA, pp. 51-56.