Online matrix prediction for sparse loss matrices

Ken Ichiro Moridomi, Kohei Hatano, Eiji Takimoto, Koji Tsuda

Research output: Contribution to journal › Conference article

Abstract

We consider an online matrix prediction problem. FTRL is a standard method for online prediction tasks; it makes predictions by minimizing the sum of the cumulative loss function and a regularizer function. There are three popular regularizers for matrices: the Frobenius norm, the negative entropy, and the log-determinant. We propose an FTRL-based algorithm with the log-determinant as the regularizer and show a regret bound for the algorithm. Our main contribution is to show that log-determinant regularization is effective when the loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with log-determinant regularization.
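The FTRL prediction described in the abstract has a simple closed form when the log-determinant regularizer is used without further constraints. The sketch below is an illustrative simplification, not the paper's algorithm: it ignores any feasible-set projection and uses an assumed `eps * I` initialization to keep the cumulative loss matrix positive definite in early rounds.

```python
import numpy as np

def ftrl_logdet(loss_matrices, eta=0.1, eps=1.0):
    """Unconstrained FTRL with the log-determinant regularizer.

    At round t the FTRL prediction minimizes
        sum_{s<t} <L_s, X>  -  (1/eta) * log det(X)
    over positive definite X. Setting the gradient C - (1/eta) X^{-1}
    to zero gives the closed form
        X_t = (eta * C_t)^{-1},   C_t = eps*I + sum_{s<t} L_s.
    """
    d = loss_matrices[0].shape[0]
    C = eps * np.eye(d)              # cumulative loss, kept positive definite
    predictions = []
    for L in loss_matrices:
        X = np.linalg.inv(eta * C)   # closed-form FTRL minimizer
        predictions.append(X)
        C = C + (L + L.T) / 2        # symmetrize, then accumulate
    return predictions
```

When the loss matrices are sparse, each round changes the cumulative matrix only where the loss is nonzero, which is the regime the abstract highlights for log-determinant regularization.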

Original language: English
Pages (from-to): 250-265
Number of pages: 16
Journal: Journal of Machine Learning Research
Volume: 39
Issue number: 2014
Publication status: Published - 1 Jan 2014
Event: 6th Asian Conference on Machine Learning, ACML 2014 - Nha Trang, Viet Nam
Duration: 26 Nov 2014 → 28 Nov 2014

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Online matrix prediction for sparse loss matrices. / Moridomi, Ken Ichiro; Hatano, Kohei; Takimoto, Eiji; Tsuda, Koji.

In: Journal of Machine Learning Research, Vol. 39, No. 2014, 01.01.2014, p. 250-265.

Moridomi, Ken Ichiro ; Hatano, Kohei ; Takimoto, Eiji ; Tsuda, Koji. / Online matrix prediction for sparse loss matrices. In: Journal of Machine Learning Research. 2014 ; Vol. 39, No. 2014. pp. 250-265.
@article{45def6059b564c5b9de4e72eb2e4370c,
title = "Online matrix prediction for sparse loss matrices",
abstract = "We consider an online matrix prediction problem. FTRL is a standard method to deal with online prediction tasks, which makes predictions by minimizing the cumulative loss function and the regularizer function. There are three popular regularizer functions for matrices, Frobenius norm, negative entropy and log-determinant. We propose an FTRL based algorithm with log-determinant as the regularizer and show a regret bound of the algorithm. Our main contribution is to show that the log-determinant regularization is effective when loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with the log-determinant regularization.",
author = "Moridomi, {Ken Ichiro} and Kohei Hatano and Eiji Takimoto and Koji Tsuda",
year = "2014",
month = "1",
day = "1",
language = "English",
volume = "39",
pages = "250--265",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
number = "2014",

}

TY - JOUR

T1 - Online matrix prediction for sparse loss matrices

AU - Moridomi, Ken Ichiro

AU - Hatano, Kohei

AU - Takimoto, Eiji

AU - Tsuda, Koji

PY - 2014/1/1

Y1 - 2014/1/1

AB - We consider an online matrix prediction problem. FTRL is a standard method to deal with online prediction tasks, which makes predictions by minimizing the cumulative loss function and the regularizer function. There are three popular regularizer functions for matrices, Frobenius norm, negative entropy and log-determinant. We propose an FTRL based algorithm with log-determinant as the regularizer and show a regret bound of the algorithm. Our main contribution is to show that the log-determinant regularization is effective when loss matrices are sparse. We also show that our algorithm is optimal for the online collaborative filtering problem with the log-determinant regularization.

UR - http://www.scopus.com/inward/record.url?scp=84984678114&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84984678114&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:84984678114

VL - 39

SP - 250

EP - 265

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

IS - 2014

ER -