Online matrix prediction for sparse loss matrices

Ken Ichiro Moridomi, Kohei Hatano, Eiji Takimoto, Koji Tsuda

Research output: Contribution to journal › Conference article

Abstract

We consider an online matrix prediction problem. FTRL (Follow the Regularized Leader) is a standard method for online prediction tasks; it makes each prediction by minimizing the sum of the cumulative loss and a regularizer. Three regularizers are popular for matrices: the Frobenius norm, negative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant regularizer and prove a regret bound for it. Our main contribution is to show that log-determinant regularization is effective when the loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with log-determinant regularization.
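The FTRL prediction rule described in the abstract can be sketched in a few lines. The sketch below is illustrative and not the paper's algorithm: it assumes linear losses <L_t, X> over symmetric positive-definite matrices, under which the FTRL objective with regularizer -(1/eta) log det X has the closed-form minimizer X = (eta * S)^(-1), where S is the cumulative loss matrix (setting the gradient S - (1/eta) X^(-1) to zero). The parameters `eta` and `eps` are hypothetical placeholders, not values from the paper.

```python
import numpy as np

def ftrl_logdet(loss_matrices, eta=0.1, eps=1e-3):
    """Illustrative FTRL sketch with a log-determinant regularizer.

    Prediction rule:
        X_{t+1} = argmin_X  sum_{s<=t} <L_s, X> - (1/eta) * log det X
    over symmetric positive-definite X. For linear losses this has the
    closed form X_{t+1} = (eta * (sum_s L_s + eps * I))^{-1}.
    Assumes each L_s is symmetric positive semidefinite.
    """
    n = loss_matrices[0].shape[0]
    cum = eps * np.eye(n)  # small ridge keeps the cumulative loss invertible
    preds = []
    for L in loss_matrices:
        preds.append(np.linalg.inv(eta * cum))  # predict before observing L
        cum = cum + L
    return preds
```

For example, with identity loss matrices, each prediction shrinks as the cumulative loss grows, which is the usual FTRL behavior of becoming more conservative over time.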

Original language: English
Pages (from-to): 250-265
Number of pages: 16
Journal: Journal of Machine Learning Research
Volume: 39
Issue number: 2014
Publication status: Published - Jan 1 2014
Event: 6th Asian Conference on Machine Learning, ACML 2014 - Nha Trang, Viet Nam
Duration: Nov 26 2014 - Nov 28 2014

Fingerprint

Determinant
Prediction
Regularization
Collaborative filtering
Frobenius norm
Regret
Entropy
Loss function

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Online matrix prediction for sparse loss matrices. / Moridomi, Ken Ichiro; Hatano, Kohei; Takimoto, Eiji; Tsuda, Koji.

In: Journal of Machine Learning Research, Vol. 39, No. 2014, 01.01.2014, p. 250-265.

Research output: Contribution to journal › Conference article

Moridomi, Ken Ichiro; Hatano, Kohei; Takimoto, Eiji; Tsuda, Koji. / Online matrix prediction for sparse loss matrices. In: Journal of Machine Learning Research. 2014; Vol. 39, No. 2014. pp. 250-265.
@article{45def6059b564c5b9de4e72eb2e4370c,
title = "Online matrix prediction for sparse loss matrices",
abstract = "We consider an online matrix prediction problem. FTRL is a standard method to deal with online prediction tasks, which makes predictions by minimizing the cumulative loss function and the regularizer function. There are three popular regularizer functions for matrices, Frobenius norm, negative entropy and log-determinant. We propose an FTRL based algorithm with log-determinant as the regularizer and show a regret bound of the algorithm. Our main contribution is to show that the log-determinant regularization is effective when loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with the log-determinant regularization.",
author = "Moridomi, {Ken Ichiro} and Kohei Hatano and Eiji Takimoto and Koji Tsuda",
year = "2014",
month = "1",
day = "1",
language = "English",
volume = "39",
pages = "250--265",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
number = "2014",
}

TY - JOUR

T1 - Online matrix prediction for sparse loss matrices

AU - Moridomi, Ken Ichiro

AU - Hatano, Kohei

AU - Takimoto, Eiji

AU - Tsuda, Koji

PY - 2014/1/1

Y1 - 2014/1/1

N2 - We consider an online matrix prediction problem. FTRL is a standard method to deal with online prediction tasks, which makes predictions by minimizing the cumulative loss function and the regularizer function. There are three popular regularizer functions for matrices, Frobenius norm, negative entropy and log-determinant. We propose an FTRL based algorithm with log-determinant as the regularizer and show a regret bound of the algorithm. Our main contribution is to show that the log-determinant regularization is effective when loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with the log-determinant regularization.

AB - We consider an online matrix prediction problem. FTRL is a standard method to deal with online prediction tasks, which makes predictions by minimizing the cumulative loss function and the regularizer function. There are three popular regularizer functions for matrices, Frobenius norm, negative entropy and log-determinant. We propose an FTRL based algorithm with log-determinant as the regularizer and show a regret bound of the algorithm. Our main contribution is to show that the log-determinant regularization is effective when loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with the log-determinant regularization.

UR - http://www.scopus.com/inward/record.url?scp=84984678114&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84984678114&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:84984678114

VL - 39

SP - 250

EP - 265

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

IS - 2014

ER -