Abstract
We have developed a method for accurately inferring true labels from labels provided by crowdsourcing workers, with the aid of self-reported confidence judgments in their labels. Although confidence judgments can be useful information for estimating the quality of the provided labels, some workers are overconfident about the quality of their labels while others are underconfident. To address this problem, we extended the Dawid-Skene model and created a probabilistic model that considers the differences among workers in their accuracy of confidence judgments. Results of experiments using actual crowdsourced data showed that incorporating workers' confidence judgments can improve the accuracy of inferred labels.
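The abstract's starting point, the Dawid-Skene model, infers true labels by jointly estimating a posterior over each item's label and a per-worker confusion matrix via EM. The sketch below implements only that baseline (it does not include the paper's confidence-calibration extension, which is not specified in the abstract); all function and variable names are illustrative, not from the paper.

```python
# Minimal Dawid-Skene EM sketch (baseline only, no confidence judgments).
# labels: list of dicts mapping worker id -> reported class, one dict per item.
import math

def dawid_skene(labels, n_classes, n_iter=50):
    """Return MAP label estimates for each item via Dawid-Skene EM."""
    n_items = len(labels)
    workers = sorted({w for item in labels for w in item})
    # Initialize the posterior T[i][k] over true labels from vote fractions.
    T = []
    for item in labels:
        counts = [0.0] * n_classes
        for lab in item.values():
            counts[lab] += 1
        s = sum(counts)
        T.append([c / s for c in counts])
    for _ in range(n_iter):
        # M-step: class priors p[k] and worker confusion matrices pi[w][k][l]
        # (probability worker w reports l when the true class is k).
        p = [sum(T[i][k] for i in range(n_items)) / n_items
             for k in range(n_classes)]
        pi = {w: [[1e-6] * n_classes for _ in range(n_classes)]
              for w in workers}
        for i, item in enumerate(labels):
            for w, lab in item.items():
                for k in range(n_classes):
                    pi[w][k][lab] += T[i][k]
        for w in workers:
            for k in range(n_classes):
                s = sum(pi[w][k])
                pi[w][k] = [v / s for v in pi[w][k]]
        # E-step: recompute the posterior over each item's true label.
        for i, item in enumerate(labels):
            logpost = [math.log(p[k] + 1e-12) for k in range(n_classes)]
            for w, lab in item.items():
                for k in range(n_classes):
                    logpost[k] += math.log(pi[w][k][lab])
            m = max(logpost)
            ex = [math.exp(v - m) for v in logpost]
            s = sum(ex)
            T[i] = [v / s for v in ex]
    return [max(range(n_classes), key=lambda k: T[i][k])
            for i in range(n_items)]

# Toy example: workers "a" and "b" are reliable, "c" is noisy.
votes = [{"a": 0, "b": 0, "c": 1},
         {"a": 1, "b": 1, "c": 0},
         {"a": 0, "b": 0, "c": 0},
         {"a": 1, "b": 1, "c": 1},
         {"a": 0, "b": 0, "c": 1}]
est = dawid_skene(votes, n_classes=2)  # -> [0, 1, 0, 1, 0]
```

The paper's extension would additionally condition each worker's reliability on a self-reported confidence score, with worker-specific calibration parameters; that part is not reproduced here.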
| Original language | English |
|---|---|
| Title of host publication | Human Computation and Crowdsourcing: Works in Progress and Demonstration Abstracts - An Adjunct to the Proceedings of the 1st AAAI Conference on Human Computation and Crowdsourcing, Technical Report |
| Publisher | AI Access Foundation |
| Pages | 58-59 |
| Number of pages | 2 |
| Volume | WS-13-18 |
| ISBN (Print) | 9781577356318 |
| Publication status | Published - 2013 |
| Event | 1st AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2013 - Palm Springs, CA, United States. Duration: Nov 6, 2013 → Nov 9, 2013 |
Other
| Other | 1st AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2013 |
|---|---|
| Country/Territory | United States |
| City | Palm Springs, CA |
| Period | 11/6/13 → 11/9/13 |
All Science Journal Classification (ASJC) codes
- Engineering (all)