Classification and comparison via neural networks

İlkay Yıldız, Peng Tian, Jennifer Dy, Deniz Erdoğmuş, James Brown, Jayashree Kalpathy-Cramer, Susan Ostmo, John Campbell, Michael Chiang, Stratis Ioannidis

Research output: Contribution to journal › Article

Abstract

We consider learning from comparison labels generated as follows: given two samples in a dataset, a labeler produces a label indicating their relative order. Such comparison labels scale quadratically with the dataset size; most importantly, in practice, they often exhibit lower variance compared to class labels. We propose a new neural network architecture based on siamese networks to incorporate both class and comparison labels in the same training pipeline, using Bradley–Terry and Thurstone loss functions. Our architecture leads to a significant improvement in predicting both class and comparison labels, increasing classification AUC by as much as 35% and comparison AUC by as much as 6% on several real-life datasets. We further show that, by incorporating comparisons, training from few samples becomes possible: a deep neural network of 5.9 million parameters trained on 80 images attains a 0.92 AUC when incorporating comparisons.
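The core idea above — a shared (siamese) scoring function whose score difference feeds a Bradley–Terry or Thurstone comparison loss — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the linear scorer, the weights `W`, and the function names are assumptions made for the example; in the paper the scorer is a deep network trained jointly with a classification loss.

```python
import numpy as np
from math import erf

# A shared (siamese) scoring function f maps each sample to a scalar
# score; the SAME weights score both items of a comparison pair.
rng = np.random.default_rng(0)
W = rng.normal(size=3)  # toy linear scorer (hypothetical weights)

def score(x):
    return x @ W

def bradley_terry_nll(xi, xj):
    """-log P(i beats j) under Bradley-Terry: P = sigmoid(f(xi) - f(xj))."""
    d = score(xi) - score(xj)
    return np.log1p(np.exp(-d))

def thurstone_nll(xi, xj):
    """-log P(i beats j) under Thurstone: P = Phi(f(xi) - f(xj)),
    where Phi is the standard normal CDF."""
    d = score(xi) - score(xj)
    p = 0.5 * (1.0 + erf(d / np.sqrt(2.0)))
    return -np.log(p)

# A labeled comparison "i beats j" contributes one of these losses;
# class labels would add a standard classification loss on score(x).
xi, xj = rng.normal(size=3), rng.normal(size=3)
print(bradley_terry_nll(xi, xj), thurstone_nll(xi, xj))
```

Both losses shrink as the score gap `f(xi) - f(xj)` grows in favor of the winning sample, and both reduce to `log 2` when the two scores tie, which is what makes them natural drop-in objectives for comparison labels alongside a classification head.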

Original language: English (US)
Pages (from-to): 65-80
Number of pages: 16
Journal: Neural Networks
Volume: 118
DOI: https://doi.org/10.1016/j.neunet.2019.06.004
State: Published - Oct 1 2019


Keywords

  • Classification
  • Comparison
  • Joint learning
  • Neural network
  • Siamese network

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence

Cite this

Yıldız, İ., Tian, P., Dy, J., Erdoğmuş, D., Brown, J., Kalpathy-Cramer, J., ... Ioannidis, S. (2019). Classification and comparison via neural networks. Neural Networks, 118, 65-80. https://doi.org/10.1016/j.neunet.2019.06.004
