Online rank aggregation

Research output: Contribution to journal › Conference article

8 Citations (Scopus)

Abstract

We consider an online learning framework where the task is to predict a permutation which represents a ranking of n fixed objects. At each trial, the learner incurs a loss defined as the Kendall tau distance between the predicted permutation and the true permutation given by the adversary. This setting is quite natural in many situations such as information retrieval and recommendation tasks. We prove a lower bound on the cumulative loss as well as hardness results. Then, we propose an algorithm for this problem and prove a relative loss bound which shows that our algorithm is close to optimal.
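To make the setting concrete, below is a minimal sketch in Python of the protocol the abstract describes: at each trial the learner predicts a permutation, the adversary reveals the true permutation, and the learner pays the Kendall tau distance (number of discordant object pairs). The `learner`/`adversary` interface and the naive O(n^2) distance computation are illustrative assumptions, not the authors' algorithm.

```python
from itertools import combinations

def kendall_tau_distance(sigma, pi):
    """Number of object pairs ordered differently by the two rankings.

    sigma and pi are permutations given as lists where position i holds
    the rank assigned to object i.
    """
    n = len(sigma)
    return sum(
        1
        for i, j in combinations(range(n), 2)
        if (sigma[i] - sigma[j]) * (pi[i] - pi[j]) < 0
    )

def run_protocol(learner, adversary, n_trials):
    """Online protocol: cumulative Kendall tau loss over n_trials rounds.

    `learner` and `adversary` are hypothetical objects with predict(),
    reveal(t), and update() methods; they stand in for any concrete
    strategy studied in the paper.
    """
    cumulative_loss = 0
    for t in range(n_trials):
        predicted = learner.predict()      # learner's ranking for trial t
        true_perm = adversary.reveal(t)    # adversary's true ranking
        loss = kendall_tau_distance(predicted, true_perm)
        learner.update(true_perm, loss)    # feedback for the next trial
        cumulative_loss += loss
    return cumulative_loss
```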

Original language: English
Pages (from-to): 539-553
Number of pages: 15
Journal: Journal of Machine Learning Research
Volume: 25
Publication status: Published - 1 Dec 2012
Event: 4th Asian Conference on Machine Learning, ACML 2012 - Singapore, Singapore
Duration: 4 Nov 2012 → 6 Nov 2012

Fingerprint

Rank Aggregation
Permutation
Agglomeration
Kendall's tau
Online Learning
Information Retrieval
Hardness
Recommendations
Ranking
Lower bound
Predict

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Online rank aggregation. / Yasutake, Shota; Hatano, Kohei; Takimoto, Eiji; Takeda, Masayuki.

In: Journal of Machine Learning Research, Vol. 25, 01.12.2012, p. 539-553.

Research output: Contribution to journal › Conference article

@article{ad4d097933154335aae691940f50fe45,
title = "Online rank aggregation",
abstract = "We consider an online learning framework where the task is to predict a permutation which represents a ranking of n fixed objects. At each trial, the learner incurs a loss defined as the Kendall tau distance between the predicted permutation and the true permutation given by the adversary. This setting is quite natural in many situations such as information retrieval and recommendation tasks. We prove a lower bound on the cumulative loss as well as hardness results. Then, we propose an algorithm for this problem and prove a relative loss bound which shows that our algorithm is close to optimal.",
author = "Shota Yasutake and Kohei Hatano and Eiji Takimoto and Masayuki Takeda",
year = "2012",
month = "12",
day = "1",
language = "English",
volume = "25",
pages = "539--553",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}

TY - JOUR

T1 - Online rank aggregation

AU - Yasutake, Shota

AU - Hatano, Kohei

AU - Takimoto, Eiji

AU - Takeda, Masayuki

PY - 2012/12/1

Y1 - 2012/12/1

N2 - We consider an online learning framework where the task is to predict a permutation which represents a ranking of n fixed objects. At each trial, the learner incurs a loss defined as the Kendall tau distance between the predicted permutation and the true permutation given by the adversary. This setting is quite natural in many situations such as information retrieval and recommendation tasks. We prove a lower bound on the cumulative loss as well as hardness results. Then, we propose an algorithm for this problem and prove a relative loss bound which shows that our algorithm is close to optimal.

AB - We consider an online learning framework where the task is to predict a permutation which represents a ranking of n fixed objects. At each trial, the learner incurs a loss defined as the Kendall tau distance between the predicted permutation and the true permutation given by the adversary. This setting is quite natural in many situations such as information retrieval and recommendation tasks. We prove a lower bound on the cumulative loss as well as hardness results. Then, we propose an algorithm for this problem and prove a relative loss bound which shows that our algorithm is close to optimal.

UR - http://www.scopus.com/inward/record.url?scp=84876844002&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84876844002&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:84876844002

VL - 25

SP - 539

EP - 553

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -