Hebbian feature discovery improves classifier efficiency

Todd Leen, Mike Rudnick, Dan Hammerstrom

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

15 Citations (Scopus)

Abstract

Two neural network implementations of principal component analysis (PCA) are used to reduce the dimension of speech signals. The compressed signals are then used to train a feedforward classification network for vowel recognition. A comparison is made of classification performance, network size, and training time for networks trained with both compressed and uncompressed data. Results show that a significant reduction in training time, fivefold in the present case, can be achieved without a sacrifice in classifier accuracy. This reduction includes the time required to train the compression network. Thus, dimension reduction, as performed by unsupervised neural networks, is a viable tool for enhancing the efficiency of neural classifiers.
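
This record does not reproduce the paper's network equations, but Hebbian PCA networks of this vintage commonly follow Oja's rule or Sanger's generalized Hebbian algorithm (GHA). The NumPy sketch below illustrates GHA-style compression under that assumption; the function name, learning rate, component count, and epoch schedule are illustrative, not taken from the paper.

    import numpy as np

    def gha_fit(X, n_components, lr=0.01, epochs=20, seed=0):
        # Sanger's generalized Hebbian algorithm (GHA): each row of W
        # converges to one principal direction of the (centered) data.
        rng = np.random.default_rng(seed)
        W = 0.1 * rng.standard_normal((n_components, X.shape[1]))
        for _ in range(epochs):
            for x in X[rng.permutation(len(X))]:
                y = W @ x  # outputs of the linear Hebbian units
                # Hebbian term y x^T, minus a lower-triangular decay that
                # deflates each unit against the units before it.
                W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
        return W

    # Hypothetical usage: compress speech frames before classifier training.
    # X = X - X.mean(axis=0)          # GHA assumes zero-mean inputs
    # W = gha_fit(X, n_components=8)  # component count is an assumption
    # X_low = X @ W.T                 # reduced-dimension classifier inputs

Training the vowel classifier on a compressed X_low rather than the raw X is the source of the training-time reduction the abstract reports, even after accounting for the time spent fitting W.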

Original language: English (US)
Title of host publication: 90 Int Jt Conf Neural Networks IJCNN 90
Publisher: Publ by IEEE
Pages: 51-56
Number of pages: 6
State: Published - 1990
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90, San Diego, CA, USA
Duration: Jun 17, 1990 – Jun 21, 1990

Fingerprint

  • Classifiers
  • Neural networks
  • Network performance
  • Principal component analysis

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Leen, T., Rudnick, M., & Hammerstrom, D. (1990). Hebbian feature discovery improves classifier efficiency. In 90 Int Jt Conf Neural Networks IJCNN 90 (pp. 51-56). IEEE.
