Spectral Ranking Regression

İlkay Yıldız, Jennifer Dy, Deniz Erdoğmuş, Susan Ostmo, J. Peter Campbell, Michael F. Chiang, Stratis Ioannidis

Research output: Contribution to journal › Article › peer-review

Abstract

We study the problem of ranking regression, in which a dataset of rankings is used to learn Plackett-Luce scores as functions of sample features. We propose a novel spectral algorithm to accelerate learning in ranking regression. Our main technical contribution is to show that the Plackett-Luce negative log-likelihood augmented with a proximal penalty has stationary points that satisfy the balance equations of a Markov Chain. This allows us to tackle the ranking regression problem via an efficient spectral algorithm by using the Alternating Directions Method of Multipliers (ADMM). ADMM separates the learning of scores and model parameters, and in turn, enables us to devise fast spectral algorithms for ranking regression via both shallow and deep neural network (DNN) models. For shallow models, our algorithms are up to 579 times faster than Newton's method. For DNN models, we extend the standard ADMM via a Kullback-Leibler proximal penalty and show that this is still amenable to fast inference via a spectral approach. Compared to a state-of-the-art Siamese network, our resulting algorithms are up to 175 times faster and attain better predictions, by up to 26% in Top-1 Accuracy and 6% in Kendall-Tau correlation, over five real-life ranking datasets.
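The abstract's core idea, that Plackett-Luce scores can be recovered from the stationary distribution of a Markov chain built from ranking data, can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm (which couples the spectral step with ADMM and feature-based models); it is an assumed, simplified spectral estimator in the spirit of Rank Centrality, with illustrative function names and a hypothetical transition-matrix construction, showing how balance equations of a comparison chain yield ranking scores.

```python
import numpy as np

def spectral_plackett_luce(rankings, n_items, eps=1e-8):
    """Illustrative spectral estimate of Plackett-Luce scores.

    Builds a pairwise-comparison Markov chain from rankings and returns
    its stationary distribution as the score vector. This is a sketch of
    the general spectral-ranking idea, not the paper's ADMM-based method.
    """
    # Each ranking (best first) implies pairwise wins: an item beats
    # every item listed after it.
    wins = np.zeros((n_items, n_items))
    for ranking in rankings:
        for i, winner in enumerate(ranking):
            for loser in ranking[i + 1:]:
                wins[winner, loser] += 1

    # Transition i -> j proportional to how often j beat i.
    totals = wins + wins.T
    P = np.divide(wins.T, totals, out=np.zeros_like(wins), where=totals > 0)
    P /= max(n_items - 1, 1)
    # Self-loops make each row sum to one (row-stochastic chain).
    P += np.diag(1.0 - P.sum(axis=1))

    # Stationary distribution: left eigenvector for eigenvalue 1,
    # i.e. the solution of the chain's balance equations.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = np.abs(pi) + eps
    return pi / pi.sum()

# Toy usage: three items, rankings given as index lists (best first).
scores = spectral_plackett_luce([[0, 1, 2], [0, 2, 1], [1, 0, 2]], n_items=3)
print(scores)  # item 0 is ranked first most often and gets the top score
```

In the paper's setting, scores are additionally tied to sample features through shallow or DNN models, and ADMM alternates between a spectral update of the scores and a regression update of the model parameters.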

Original language: English (US)
Article number: 120
Journal: ACM Transactions on Knowledge Discovery from Data
Volume: 16
Issue number: 6
DOIs
State: Published - Jul 30 2022

Keywords

  • ADMM
  • Markov Chain
  • Plackett-Luce
  • ranking
  • spectral methods

ASJC Scopus subject areas

  • General Computer Science
