Hebbian feature discovery improves classifier efficiency

Todd Leen, Mike Rudnick, Dan Hammerstrom

Research output: Contribution to conference › Paper › peer-review

16 Scopus citations


Two neural network implementations of principal component analysis (PCA) are used to reduce the dimension of speech signals. The compressed signals are then used to train a feedforward classification network for vowel recognition. A comparison is made of classification performance, network size, and training time for networks trained with both compressed and uncompressed data. Results show that a significant reduction in training time, fivefold in the present case, can be achieved without a sacrifice in classifier accuracy. This reduction includes the time required to train the compression network. Thus, dimension reduction, as performed by unsupervised neural networks, is a viable tool for enhancing the efficiency of neural classifiers.
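The abstract does not specify which two PCA network implementations were used, but a classic Hebbian approach to extracting the leading principal component is Oja's rule, where a linear unit's weight vector converges to the dominant eigenvector of the input covariance. The sketch below is a minimal, hypothetical illustration of that idea on toy 2-D data (not the authors' speech data or their actual networks): the learned projection `y = w · x` is the kind of 1-D compressed code that could then be fed to a classifier.

```python
import math
import random

random.seed(0)

# Toy data: 2-D samples whose variance is concentrated along the
# direction (1, 0.3) -- a stand-in for correlated input features.
data = []
for _ in range(2000):
    t = random.gauss(0.0, 1.0)       # strong (principal) component
    n = random.gauss(0.0, 0.1)       # weak noise component
    data.append((t + n, 0.3 * t - n))

# Oja's Hebbian rule: w <- w + eta * y * (x - y * w), with y = w . x.
# The weight vector converges to a unit vector along the leading
# principal component of the data.
w = [1.0, 1.0]
eta = 0.01
for epoch in range(20):
    for x in data:
        y = w[0] * x[0] + w[1] * x[1]
        w[0] += eta * y * (x[0] - y * w[0])
        w[1] += eta * y * (x[1] - y * w[1])

norm = math.hypot(w[0], w[1])
# After training, |w| is close to 1 and w points along the dominant
# direction, so y = w . x compresses each sample to one number.
```

Because the rule is unsupervised, such a compression network can be trained first and cheaply; the classifier then sees lower-dimensional inputs, which is the source of the training-time savings the abstract reports.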

Original language: English (US)
Number of pages: 6
State: Published - Dec 1 1990
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90 - San Diego, CA, USA
Duration: Jun 17 1990 - Jun 21 1990


Other: 1990 International Joint Conference on Neural Networks - IJCNN 90
City: San Diego, CA, USA

ASJC Scopus subject areas

  • Engineering (all)

