Dimension Reduction by Local Principal Component Analysis

Nandakishore Kambhatla, Todd K. Leen

Research output: Contribution to journal › Article

404 Citations (Scopus)

Abstract

Reducing or eliminating statistical redundancy between the components of high-dimensional vector data enables a lower-dimensional representation without significant loss of information. Recognizing the limitations of principal component analysis (PCA), researchers in the statistics and neural network communities have developed nonlinear extensions of PCA. This article develops a local linear approach to dimension reduction that provides accurate representations and is fast to compute. We exercise the algorithms on speech and image data, and compare performance with PCA and with neural network implementations of nonlinear PCA. We find that both nonlinear techniques can provide more accurate representations than PCA and show that the local linear techniques outperform neural network implementations.
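
The "local linear approach" of the abstract can be pictured as piecewise PCA: partition the data into regions, fit an ordinary PCA inside each region, and encode every point in its region's low-dimensional subspace. Below is a minimal NumPy sketch of that idea, assuming a Euclidean k-means partition; the function names and the parameters n_clusters and n_components are illustrative choices for this example, and the paper's own algorithm may differ in its partitioning details.

import numpy as np

def local_pca_fit(X, n_clusters=4, n_components=2, n_iters=20, seed=0):
    """Cluster X with Lloyd-style k-means, then fit a PCA basis per cluster."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_clusters, replace=False)].copy()
    for _ in range(n_iters):
        # Hard-assign each point to its nearest center (Euclidean distance).
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):  # empty clusters keep their old center
                centers[k] = X[labels == k].mean(axis=0)
    bases = []
    for k in range(n_clusters):
        Xk = X[labels == k] - centers[k]
        # Leading right singular vectors of the centered cluster data
        # are the local principal axes.
        _, _, Vt = np.linalg.svd(Xk, full_matrices=False)
        bases.append(Vt[:n_components])
    return centers, bases

def local_pca_reconstruct(X, centers, bases):
    """Encode each point in its nearest cluster's subspace, then decode."""
    labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(axis=1)
    X_hat = np.empty_like(X)
    for k, V in enumerate(bases):
        idx = labels == k
        Z = (X[idx] - centers[k]) @ V.T   # local low-dimensional codes
        X_hat[idx] = Z @ V + centers[k]   # decode back to the ambient space
    return X_hat

# Toy check: noisy points along a curved 1-D manifold embedded in 3-D.
t = np.linspace(0.0, 3.0 * np.pi, 500)
X = np.stack([np.cos(t), np.sin(t), 0.3 * t], axis=1)
X += 0.05 * np.random.default_rng(1).normal(size=X.shape)
centers, bases = local_pca_fit(X, n_clusters=6, n_components=1)
err = np.mean((X - local_pca_reconstruct(X, centers, bases)) ** 2)
print(f"mean squared reconstruction error: {err:.5f}")

A quick way to see the point of the article is to compare this error against a single global PCA with the same n_components: on curved data like the toy manifold above, the piecewise fit typically reconstructs more accurately, which is the kind of comparison the article reports on speech and image data.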

Original language: English (US)
Pages (from-to): 1493-1516
Number of pages: 24
Journal: Neural Computation
Volume: 9
Issue number: 7
State: Published - Oct 1 1997

ASJC Scopus subject areas

  • Artificial Intelligence
  • Control and Systems Engineering
  • Neuroscience (all)

Cite this

Kambhatla, N., & Leen, T. K. (1997). Dimension Reduction by Local Principal Component Analysis. Neural Computation, 9(7), 1493-1516.

Kambhatla, Nandakishore; Leen, Todd K. / Dimension Reduction by Local Principal Component Analysis. In: Neural Computation, Vol. 9, No. 7, 01.10.1997, p. 1493-1516.

Kambhatla, N & Leen, TK 1997, 'Dimension Reduction by Local Principal Component Analysis', Neural Computation, vol. 9, no. 7, pp. 1493-1516.
Kambhatla N, Leen TK. Dimension Reduction by Local Principal Component Analysis. Neural Computation. 1997 Oct 1;9(7):1493-1516.
@article{3395faa9a4fd442a81f3e9ada93e56f9,
  title = "Dimension Reduction by Local Principal Component Analysis",
  abstract = "Reducing or eliminating statistical redundancy between the components of high-dimensional vector data enables a lower-dimensional representation without significant loss of information. Recognizing the limitations of principal component analysis (PCA), researchers in the statistics and neural network communities have developed nonlinear extensions of PCA. This article develops a local linear approach to dimension reduction that provides accurate representations and is fast to compute. We exercise the algorithms on speech and image data, and compare performance with PCA and with neural network implementations of nonlinear PCA. We find that both nonlinear techniques can provide more accurate representations than PCA and show that the local linear techniques outperform neural network implementations.",
  author = "Nandakishore Kambhatla and Leen, {Todd K.}",
  year = "1997",
  month = "10",
  day = "1",
  language = "English (US)",
  volume = "9",
  pages = "1493--1516",
  journal = "Neural Computation",
  issn = "0899-7667",
  publisher = "MIT Press Journals",
  number = "7",
}

TY - JOUR

T1 - Dimension Reduction by Local Principal Component Analysis

AU - Kambhatla, Nandakishore

AU - Leen, Todd K.

PY - 1997/10/1

Y1 - 1997/10/1

AB - Reducing or eliminating statistical redundancy between the components of high-dimensional vector data enables a lower-dimensional representation without significant loss of information. Recognizing the limitations of principal component analysis (PCA), researchers in the statistics and neural network communities have developed nonlinear extensions of PCA. This article develops a local linear approach to dimension reduction that provides accurate representations and is fast to compute. We exercise the algorithms on speech and image data, and compare performance with PCA and with neural network implementations of nonlinear PCA. We find that both nonlinear techniques can provide more accurate representations than PCA and show that the local linear techniques outperform neural network implementations.

UR - http://www.scopus.com/inward/record.url?scp=0348139702&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0348139702&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0348139702

VL - 9

SP - 1493

EP - 1516

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 7

ER -