Active Crowdsourcing for Annotation


Authors: S. Hao, C. Miao, S. C. H. Hoi, and P. Zhao
Title: Active Crowdsourcing for Annotation
Abstract: Crowdsourcing has shown great potential for obtaining large-scale and cheap labels for different tasks. However, obtaining reliable labels is challenging for several reasons, such as noisy annotators and limited budgets. State-of-the-art approaches either suffer in noisy scenarios or rely on unlimited resources to acquire reliable labels. In this article, we adopt the learning-with-expert-advice framework (where an expert corresponds to a worker in crowdsourcing) to robustly infer accurate labels by considering the reliability of each worker. However, to accurately estimate each worker's reliability, traditional learning with expert advice consults external oracles (i.e., domain experts) for the true label of every instance. To reduce the cost of consultation, we propose two active learning approaches: one based on the prediction margin and one based on the weighted difference of the workers' advice. Meanwhile, to address the problem of a limited annotation budget, we propose a reliability-based assignment approach that actively decides which worker should annotate the next instance based on each worker's cumulative performance. Experimental results on both real and simulated datasets show that our algorithms achieve robust and promising performance in both normal and noisy scenarios with a limited budget.
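
The abstract describes three ingredients: expert-advice-style weighted aggregation of worker votes, a margin-based rule for deciding when to consult an oracle, and reliability-based assignment of the next instance. The sketch below is only an illustration of these ideas in Python, not the authors' implementation; the Hedge-style multiplicative update, the class and parameter names (eta, margin_threshold), and the simulated workers are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

class ActiveCrowdLabeler:
    # Illustrative sketch only (not the paper's code): weighted aggregation of
    # worker votes, a margin-based oracle-query rule, and reliability-weighted
    # selection of the next worker.
    def __init__(self, n_workers, eta=0.5, margin_threshold=0.2):
        self.weights = np.ones(n_workers)         # one weight per worker ("expert")
        self.eta = eta                            # learning rate for weight updates
        self.margin_threshold = margin_threshold  # query the oracle when the margin is small

    def aggregate(self, votes):
        # Weighted vote over binary labels in {0, 1}; returns label and margin.
        w = self.weights / self.weights.sum()
        score = float(np.dot(w, votes))           # weighted fraction voting for class 1
        label = int(score >= 0.5)
        margin = abs(score - 0.5)                 # small margin -> uncertain prediction
        return label, margin

    def should_query_oracle(self, margin):
        # Margin-based active learning: consult the oracle only when uncertain.
        return margin < self.margin_threshold

    def update(self, votes, true_label):
        # Hedge-style multiplicative update: down-weight workers that disagreed
        # with the consulted true label.
        mistakes = (np.asarray(votes) != true_label).astype(float)
        self.weights *= np.exp(-self.eta * mistakes)

    def pick_worker(self):
        # Reliability-based assignment: workers with a better cumulative track
        # record (larger weight) are more likely to receive the next instance.
        probs = self.weights / self.weights.sum()
        return int(rng.choice(len(self.weights), p=probs))

# Toy run with simulated workers of different accuracies.
labeler = ActiveCrowdLabeler(n_workers=5)
accuracies = np.array([0.9, 0.8, 0.7, 0.6, 0.55])
for _ in range(200):
    truth = int(rng.integers(0, 2))
    votes = np.where(rng.random(5) < accuracies, truth, 1 - truth)
    pred, margin = labeler.aggregate(votes)
    if labeler.should_query_oracle(margin):       # each consultation costs budget
        labeler.update(votes, truth)
print(labeler.weights)                            # reliable workers end up with larger weights

In this toy setup, the margin rule is what limits oracle consultations, while the learned weights serve double duty: they improve label aggregation and drive the reliability-based choice of which worker annotates next.
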
Keywords: 
Conference Name: 2015 IEEE/WIC/ACM International Joint Conferences on Web Intelligence and Intelligent Agent Technology (WI-IAT'15)
Location: Singapore, Singapore
Publisher: IEEE
Year: 2015
Accepted PDF File: Active_Crowdsourcing_for_Annotation_accepted.pdf
Permanent Link: http://dx.doi.org/10.1109/WI-IAT.2015.34
Reference: S. Hao, C. Miao, S. C. H. Hoi, and P. Zhao, “Active crowdsourcing for annotation,” in Proceedings of the 2015 IEEE/WIC/ACM International Joint Conferences on Web Intelligence and Intelligent Agent Technology (WI-IAT'15). IEEE, December 2015, pp. 1-8.
bibtex: 
@inproceedings{LILY-c53,
    author    = {Hao, Shuji and Miao, Chunyan and Hoi, Steven C. H. and Zhao, Peilin},
    title     = {Active Crowdsourcing for Annotation},
    booktitle = {Proceedings of the 2015 IEEE/WIC/ACM International Joint Conferences on Web Intelligence and Intelligent Agent Technology (WI-IAT'15)},
    year      = {2015},
    month     = {December},
    pages     = {1-8},
    location  = {Singapore, Singapore},
    publisher = {IEEE},
}