Localized multiple kernel regression

Mehmet Gönen, Ethem Alpaydin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

27 Scopus citations


Multiple kernel learning (MKL) uses a weighted combination of kernels where the weight of each kernel is optimized during training. However, MKL assigns the same weight to a kernel over the whole input space. Our main objective is the formulation of the localized multiple kernel learning (LMKL) framework, which allows kernels to be combined with different weights in different regions of the input space by using a gating model. In this paper, we apply the LMKL framework to regression estimation and derive a learning algorithm for this extension. Canonical support vector regression may overfit unless the kernel parameters are selected appropriately; we see that even if we provide more kernels than necessary, LMKL uses only as many as needed and does not overfit, owing to its inherent regularization.
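As a rough illustration of the locally combined kernel the abstract describes, the sketch below computes a kernel of the form k(x_i, x_j) = Σ_m η_m(x_i) k_m(x_i, x_j) η_m(x_j), where the gating model η is a softmax over linear functions of the input. This is an assumption-laden reading, not the paper's exact formulation: the gating parameterization, the particular base kernels, and all function names here are hypothetical.

```python
import numpy as np

def softmax_gating(X, V, v0):
    """Hypothetical gating model eta_m(x): softmax over linear
    functions of the input.
    X: (n, d) inputs; V: (p, d) gating weights; v0: (p,) biases."""
    scores = X @ V.T + v0                          # (n, p)
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)        # rows sum to 1

def localized_kernel(X, kernels, V, v0):
    """Locally combined kernel (sketch):
    k(x_i, x_j) = sum_m eta_m(x_i) * k_m(x_i, x_j) * eta_m(x_j),
    so each base kernel's weight varies over the input space."""
    eta = softmax_gating(X, V, v0)                 # (n, p)
    K = np.zeros((X.shape[0], X.shape[0]))
    for m, k_m in enumerate(kernels):
        # outer product applies eta_m(x_i) on rows and eta_m(x_j) on columns
        K += np.outer(eta[:, m], eta[:, m]) * k_m(X, X)
    return K

# Two toy base kernels: linear and Gaussian (RBF with unit width).
linear = lambda A, B: A @ B.T
rbf = lambda A, B: np.exp(-np.square(A[:, None, :] - B[None, :, :]).sum(-1))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))    # 5 toy points in 2 dimensions
V = rng.standard_normal((2, 2))    # gating weights, one row per kernel
v0 = np.zeros(2)
K = localized_kernel(X, [linear, rbf], V, v0)
```

Because each term is a diagonal scaling D_m K_m D_m of a valid kernel matrix, the combined matrix stays symmetric and positive semidefinite, so it can be plugged into a standard support vector regression solver.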

Original language: English (US)
Title of host publication: Proceedings - 2010 20th International Conference on Pattern Recognition, ICPR 2010
Number of pages: 4
State: Published - Nov 18 2010
Event: 2010 20th International Conference on Pattern Recognition, ICPR 2010 - Istanbul, Turkey
Duration: Aug 23 2010 - Aug 26 2010

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651


Other: 2010 20th International Conference on Pattern Recognition, ICPR 2010

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


