Multiclass posterior probability support vector machines

Mehmet Gonen, Ayşe Gönül Tanuğur, Ethem Alpaydin

Research output: Contribution to journal › Article

50 Citations (Scopus)

Abstract

Tao et al. have recently proposed the posterior probability support vector machine (PPSVM) which uses soft labels derived from estimated posterior probabilities to be more robust to noise and outliers. Tao et al.'s model uses a window-based density estimator to calculate the posterior probabilities and is a binary classifier. We propose a neighbor-based density estimator and also extend the model to the multiclass case. Our bias-variance analysis shows that the decrease in error by PPSVM is due to a decrease in bias. On 20 benchmark data sets, we observe that PPSVM obtains accuracy results that are higher or comparable to those of canonical SVM using significantly fewer support vectors.
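The neighbor-based soft-labeling idea described in the abstract can be illustrated with a minimal sketch: the posterior probability of the positive class is estimated from the class frequencies among each training point's k nearest neighbors and then mapped to a soft label in [-1, +1], which a PPSVM-style trainer would use in place of the hard labels. The function name knn_soft_labels and the parameter k below are illustrative assumptions, not taken from the paper.

# Minimal sketch (assumed details, not the authors' code): estimate P(+1 | x)
# from the k nearest neighbors of each training point and convert it to a
# soft label 2*P(+1 | x) - 1 in [-1, +1], as a PPSVM-style method would use.
import numpy as np

def knn_soft_labels(X, y, k=5):
    """Estimate soft labels from k-nearest-neighbor class frequencies."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = X.shape[0]
    # Pairwise squared Euclidean distances between all training points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)          # exclude each point itself
    soft = np.empty(n)
    for i in range(n):
        nn = np.argsort(d2[i])[:k]        # indices of the k nearest neighbors
        p_pos = np.mean(y[nn] == 1)       # estimated P(+1 | x_i)
        soft[i] = 2.0 * p_pos - 1.0       # soft label in [-1, +1]
    return soft

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
    y = np.array([-1] * 50 + [+1] * 50)
    print(knn_soft_labels(X, y, k=7)[:5])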

Original language: English (US)
Pages (from-to): 130-139
Number of pages: 10
Journal: IEEE Transactions on Neural Networks
Volume: 19
Issue number: 1
DOI: 10.1109/TNN.2007.903157
State: Published - Jan 2008
Externally published: Yes

Keywords

  • Density estimation
  • Kernel machines
  • Multiclass classification
  • Support vector machines (SVMs)

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture

Cite this

Multiclass posterior probability support vector machines. / Gonen, Mehmet; Tanuğur, Ayşe Gönül; Alpaydin, Ethem.

In: IEEE Transactions on Neural Networks, Vol. 19, No. 1, 01.2008, p. 130-139.

Research output: Contribution to journal › Article

Gonen, Mehmet ; Tanuğur, Ayşe Gönül ; Alpaydin, Ethem. / Multiclass posterior probability support vector machines. In: IEEE Transactions on Neural Networks. 2008 ; Vol. 19, No. 1. pp. 130-139.
@article{9c290b75a835431589cc5afc24be5e8b,
title = "Multiclass posterior probability support vector machines",
abstract = "Tao et al. have recently proposed the posterior probability support vector machine (PPSVM) which uses soft labels derived from estimated posterior probabilities to be more robust to noise and outliers. Tao et al.'s model uses a window-based density estimator to calculate the posterior probabilities and is a binary classifier. We propose a neighbor-based density estimator and also extend the model to the multiclass case. Our bias-variance analysis shows that the decrease in error by PPSVM is due to a decrease in bias. On 20 benchmark data sets, we observe that PPSVM obtains accuracy results that are higher or comparable to those of canonical SVM using significantly fewer support vectors.",
keywords = "Density estimation, Kernel machines, Multiclass classification, Support vector machines (SVMs)",
author = "Mehmet Gonen and Tanuǧur, {Ayşe G{\"o}n{\"u}l} and Ethem Alpaydin",
year = "2008",
month = "1",
doi = "10.1109/TNN.2007.903157",
language = "English (US)",
volume = "19",
pages = "130--139",
journal = "IEEE Transactions on Neural Networks and Learning Systems",
issn = "2162-237X",
publisher = "IEEE Computational Intelligence Society",
number = "1",

}

TY - JOUR

T1 - Multiclass posterior probability support vector machines

AU - Gonen, Mehmet

AU - Tanuğur, Ayşe Gönül

AU - Alpaydin, Ethem

PY - 2008/1

Y1 - 2008/1

N2 - Tao et al. have recently proposed the posterior probability support vector machine (PPSVM) which uses soft labels derived from estimated posterior probabilities to be more robust to noise and outliers. Tao et al.'s model uses a window-based density estimator to calculate the posterior probabilities and is a binary classifier. We propose a neighbor-based density estimator and also extend the model to the multiclass case. Our bias-variance analysis shows that the decrease in error by PPSVM is due to a decrease in bias. On 20 benchmark data sets, we observe that PPSVM obtains accuracy results that are higher or comparable to those of canonical SVM using significantly fewer support vectors.

AB - Tao et al. have recently proposed the posterior probability support vector machine (PPSVM) which uses soft labels derived from estimated posterior probabilities to be more robust to noise and outliers. Tao et al.'s model uses a window-based density estimator to calculate the posterior probabilities and is a binary classifier. We propose a neighbor-based density estimator and also extend the model to the multiclass case. Our bias-variance analysis shows that the decrease in error by PPSVM is due to a decrease in bias. On 20 benchmark data sets, we observe that PPSVM obtains accuracy results that are higher or comparable to those of canonical SVM using significantly fewer support vectors.

KW - Density estimation

KW - Kernel machines

KW - Multiclass classification

KW - Support vector machines (SVMs)

UR - http://www.scopus.com/inward/record.url?scp=39549091332&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=39549091332&partnerID=8YFLogxK

U2 - 10.1109/TNN.2007.903157

DO - 10.1109/TNN.2007.903157

M3 - Article

C2 - 18269944

AN - SCOPUS:39549091332

VL - 19

SP - 130

EP - 139

JO - IEEE Transactions on Neural Networks

JF - IEEE Transactions on Neural Networks

SN - 1045-9227

IS - 1

ER -