Multiple kernel learning algorithms

Mehmet Gonen, Ethem Alpaydin

Research output: Contribution to journal › Article

1018 Citations (Scopus)

Abstract

In recent years, several methods have been proposed to combine multiple kernels instead of using a single one. These different kernels may correspond to different notions of similarity, or may use information coming from multiple sources (different representations or different feature subsets). To organize and highlight the similarities and differences between them, we give a taxonomy of and review several multiple kernel learning algorithms. We perform experiments on real data sets for better illustration and comparison of existing algorithms. We see that although there may not be large differences in terms of accuracy, the algorithms do differ in complexity, as given by the number of stored support vectors; in the sparsity of the solution, as given by the number of kernels used; and in training time complexity. We see that, overall, using multiple kernels instead of a single one is useful, and we believe that combining kernels in a nonlinear or data-dependent way seems more promising than linear combination when fusing information provided by simple linear kernels, whereas linear methods are more reasonable when combining complex Gaussian kernels.
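As a rough illustration of the simplest combination scheme the abstract mentions (a fixed-weight linear combination of base kernels), the sketch below builds a convex combination of a linear and a Gaussian kernel with NumPy. This is not any specific algorithm from the paper: the weights `eta` are fixed by hand here, whereas the surveyed algorithms learn them from data, and all function names are ours.

```python
import numpy as np

def linear_kernel(X, Y):
    # k(x, y) = x . y
    return X @ Y.T

def gaussian_kernel(X, Y, gamma=0.5):
    # k(x, y) = exp(-gamma * ||x - y||^2), via squared pairwise distances
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def combined_kernel(X, Y, eta=(0.3, 0.7)):
    # Convex combination k = eta1*k_lin + eta2*k_rbf; a nonnegative weighted
    # sum of valid kernels is itself a valid (positive semidefinite) kernel.
    return eta[0] * linear_kernel(X, Y) + eta[1] * gaussian_kernel(X, Y)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K = combined_kernel(X, X)
# The combined Gram matrix stays symmetric positive semidefinite,
# so it can be plugged into any kernel machine (e.g. an SVM).
print(np.allclose(K, K.T), bool(np.all(np.linalg.eigvalsh(K) >= -1e-9)))
```

Learning the weights (e.g. by alternating between training an SVM on the combined Gram matrix and updating `eta`) is what distinguishes the algorithms the survey compares.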

Original language: English (US)
Pages (from-to): 2211-2268
Number of pages: 58
Journal: Journal of Machine Learning Research
Volume: 12
State: Published - Jul 2011
Externally published: Yes

Keywords

  • Kernel machines
  • Multiple kernel learning
  • Support vector machines

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Cite this

Gonen, M., & Alpaydin, E. (2011). Multiple kernel learning algorithms. Journal of Machine Learning Research, 12, 2211-2268.
@article{06c05291158843458a133074f72aa1b4,
title = "Multiple kernel learning algorithms",
keywords = "Kernel machines, Multiple kernel learning, Support vector machines",
author = "Mehmet Gonen and Ethem Alpaydin",
year = "2011",
month = "7",
language = "English (US)",
volume = "12",
pages = "2211--2268",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}

TY - JOUR

T1 - Multiple kernel learning algorithms

AU - Gonen, Mehmet

AU - Alpaydin, Ethem

PY - 2011/7

Y1 - 2011/7

KW - Kernel machines

KW - Multiple kernel learning

KW - Support vector machines

UR - http://www.scopus.com/inward/record.url?scp=80052213499&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=80052213499&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:80052213499

VL - 12

SP - 2211

EP - 2268

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -