Feature selection by independent component analysis and mutual information maximization in EEG signal classification

Tian Lan, Deniz Erdogmus, Andre Adami, Michael Pavel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

21 Scopus citations

Abstract

Feature selection and dimensionality reduction are important steps in pattern recognition. In this paper, we propose a feature selection scheme based on linear independent component analysis and mutual information maximization. The method is theoretically motivated by the fact that the classification error rate is related to the mutual information between the feature vectors and the class labels. The feasibility of the principle is illustrated on a synthetic dataset, and its performance is demonstrated on EEG signal classification. Experimental results show that this method works well for feature selection.
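The abstract does not specify implementation details, in particular the entropy/mutual-information estimator used by the authors, so the following is only a minimal sketch of the general idea: transform the features with linear ICA, rank the resulting independent components by their estimated mutual information with the class labels, and keep the top-ranked ones. It assumes scikit-learn's FastICA for the ICA step and its k-nearest-neighbor mutual_info_classif estimator as a stand-in for the paper's MI criterion; all function and parameter names here are illustrative, not the authors' code.

```python
# Illustrative sketch only; FastICA and mutual_info_classif are stand-ins for the
# ICA transform and MI estimator actually used in the paper.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_classif


def select_features_ica_mi(X, y, n_keep=5, random_state=0):
    """Apply linear ICA to the feature vectors, estimate the mutual information
    between each independent component and the class labels, and keep the
    n_keep components with the highest MI."""
    ica = FastICA(random_state=random_state)
    S = ica.fit_transform(X)                      # independent components, shape (n_samples, n_components)
    mi = mutual_info_classif(S, y, random_state=random_state)  # MI estimate per component
    order = np.argsort(mi)[::-1]                  # components sorted by decreasing MI
    kept = order[:n_keep]
    return S[:, kept], kept, mi


if __name__ == "__main__":
    # Small synthetic example (stand-in for EEG feature vectors).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # labels depend on only a few features
    Z, kept, mi = select_features_ica_mi(X, y, n_keep=3)
    print("kept components:", kept)
    print("MI scores:", np.round(mi, 3))
```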

Original language: English (US)
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2005
Pages: 3011-3016
Number of pages: 6
DOIs
State: Published - 2005
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: Jul 31, 2005 – Aug 4, 2005

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 5

Other

Other: International Joint Conference on Neural Networks, IJCNN 2005
Country/Territory: Canada
City: Montreal, QC
Period: 7/31/05 – 8/4/05

Keywords

  • Brain-Computer Interface
  • EEG
  • Entropy Estimation
  • Feature Selection
  • Independent Component Analysis
  • Mutual Information

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
