Understanding emotional impact of images using Bayesian multiple kernel learning

He Zhang, Mehmet Gonen, Zhirong Yang, Erkki Oja

Research output: Contribution to journal › Article

9 Citations (Scopus)

Abstract

Affective classification and retrieval of multimedia such as audio, images, and video have become emerging research areas in recent years. Previous research has focused largely on designing features and developing feature extraction methods. Generally, multimedia content can be represented with different feature representations (i.e., views). However, the feature representation most suitable for capturing people's emotions is usually not known a priori. We propose a novel Bayesian multiple kernel learning algorithm for affective classification and retrieval tasks. The proposed method can use different representations simultaneously (i.e., multiview learning) to obtain better prediction performance than a single feature representation (i.e., single-view learning) or a subset of features, with the advantage of automatic feature selection. In particular, our algorithm is implemented within a multilabel setup to capture correlations between emotions, and the Bayesian formulation enables our method to produce probabilistic outputs that measure the set of emotions triggered by a single image. As a case study, we perform classification and retrieval experiments for predicting people's emotional states evoked by images, using generic low-level image features. On the widely used International Affective Picture System (IAPS) data set, our approach outperforms several existing methods in terms of both classification performance and interpretability of results.
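The core idea can be illustrated with a small sketch: each feature representation (view) of an image is turned into a kernel, and the kernels are combined with per-view weights before classification. The Python sketch below is a minimal, hypothetical illustration of this multiple kernel learning setup using a fixed weighted-sum kernel and a scikit-learn SVM back-end; it is not the paper's Bayesian variational algorithm, and the feature views, weights, and kernel widths are illustrative assumptions.

```python
# Minimal sketch of the multiple kernel learning (MKL) idea: several feature
# representations (views) of the same images are turned into kernels and
# combined with per-view weights before classification. This is NOT the
# paper's Bayesian variational algorithm; views, weights, kernel widths, and
# the SVM back-end are illustrative assumptions.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

def combined_kernel(views, weights, gammas):
    """Weighted sum of RBF kernels, one kernel per feature view."""
    return sum(w * rbf_kernel(X, gamma=g)
               for X, w, g in zip(views, weights, gammas))

# Toy data: two hypothetical views (e.g., color and texture features) of 100
# images, with a binary emotion label (e.g., positive vs. negative valence).
rng = np.random.default_rng(0)
color = rng.normal(size=(100, 16))
texture = rng.normal(size=(100, 32))
y = rng.integers(0, 2, size=100)

# Per-view kernel weights are fixed here; the paper instead infers them
# within a Bayesian multilabel model.
weights = [0.7, 0.3]
gammas = [0.1, 0.05]

K_train = combined_kernel([color, texture], weights, gammas)
clf = SVC(kernel="precomputed").fit(K_train, y)
print("training accuracy:", clf.score(K_train, y))
```

In the paper's Bayesian treatment, the kernel (view) weights and the classifier are inferred jointly via variational approximation within a multilabel setup, which also yields probabilistic emotion scores rather than the hard labels produced by this sketch.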

Original language: English (US)
Pages (from-to): 3-13
Number of pages: 11
Journal: Neurocomputing
Volume: 165
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2014.10.093
State: Published - Oct 1, 2015
Externally published: Yes

Fingerprint

  • Learning
  • Emotions
  • Multimedia
  • Feature extraction
  • Research
  • Learning algorithms
  • Experiments
  • Datasets

Keywords

  • Image emotions
  • Low-level image features
  • Multiple kernel learning
  • Multiview learning
  • Variational approximation

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Cognitive Neuroscience

Cite this

Zhang, He; Gonen, Mehmet; Yang, Zhirong; Oja, Erkki. Understanding emotional impact of images using Bayesian multiple kernel learning. In: Neurocomputing, Vol. 165, 01.10.2015, p. 3-13. DOI: 10.1016/j.neucom.2014.10.093