Research Article

An Innovative Crowdsourcing Approach for Amazon Mechanical Turk

by Hanieh Javadi Khasraghi, Shahriar Mohammadi
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 52 - Number 4
Year of Publication: 2012
DOI: 10.5120/8190-1556

Hanieh Javadi Khasraghi, Shahriar Mohammadi. An Innovative Crowdsourcing Approach for Amazon Mechanical Turk. International Journal of Computer Applications 52, 4 (August 2012), 20-25. DOI=10.5120/8190-1556

@article{10.5120/8190-1556,
  author     = {Hanieh Javadi Khasraghi and Shahriar Mohammadi},
  title      = {An Innovative Crowdsourcing Approach for Amazon Mechanical Turk},
  journal    = {International Journal of Computer Applications},
  issue_date = {August 2012},
  volume     = {52},
  number     = {4},
  month      = {August},
  year       = {2012},
  issn       = {0975-8887},
  pages      = {20-25},
  numpages   = {6},
  url        = {https://ijcaonline.org/archives/volume52/number4/8190-1556/},
  doi        = {10.5120/8190-1556},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Hanieh Javadi Khasraghi
%A Shahriar Mohammadi
%T An Innovative Crowdsourcing Approach for Amazon Mechanical Turk
%J International Journal of Computer Applications
%@ 0975-8887
%V 52
%N 4
%P 20-25
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Web 2.0 and the evolving vision of Web 3.0 have greatly facilitated information sharing, information aggregation, interoperability, user-centered design, collaboration on the World Wide Web, and crowd-centered services. This new conception of the Web is the intuition that drives crowdsourcing, crowd servicing, and crowd computing. With the emergence of crowdsourcing, people are motivated to work over the Internet without being limited by time or geographic location; employers, in turn, can have their jobs done faster and more cheaply. This paper introduces an innovative approach for the Amazon Mechanical Turk (AMT) crowdsourcing marketplace. In the current AMT marketplace, workers, especially new ones, must qualify themselves separately for each requester who submits Human Intelligence Tasks (HITs), and there is no shared reputation system. Some workers cheat on tasks to maximize their income; as a result, requesters are uncertain of the quality of results, so they offer lower rewards, and qualified workers consequently leave the marketplace. To address these shortcomings, we introduce a new approach for the AMT crowdsourcing marketplace: distribute HITs among Amazon's customers and ask them to work on tasks in exchange for a discount. The distribution of HITs is based on customers' interests and skills, information that Amazon already holds in its database. Under the proposed approach, HITs are completed by more qualified people and spamming is reduced to a minimum. The approach is efficient, time saving, and user friendly, because workers do not need to search for HITs matching their interests.
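The matching step the abstract alludes to can be sketched in a few lines: score each HIT against each customer's stored interest and skill tags, then offer the task to the best matches. The sketch below is purely illustrative, assuming profile data is available as simple tag sets; the Customer and HIT types, match_score, and distribute_hits are hypothetical names, not anything specified in the paper or in the AMT API.

# Illustrative sketch of interest/skill-based HIT distribution.
# All names here are hypothetical; the paper does not give an implementation.

from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    interests: set[str]          # e.g. {"photography", "translation"}
    skills: set[str]             # e.g. {"english", "image-tagging"}

@dataclass
class HIT:
    hit_id: str
    topics: set[str]             # what the task is about
    required_skills: set[str]    # what a worker needs to complete it
    discount: float              # reward, expressed as a purchase discount

def match_score(customer: Customer, hit: HIT) -> float:
    """Fraction of the HIT's topics and required skills the customer covers."""
    wanted = hit.topics | hit.required_skills
    have = customer.interests | customer.skills
    return len(wanted & have) / len(wanted) if wanted else 0.0

def distribute_hits(hits, customers, top_k=3, threshold=0.5):
    """Offer each HIT to the top_k best-matching customers above a cutoff."""
    offers = {}
    for hit in hits:
        ranked = sorted(customers, key=lambda c: match_score(c, hit),
                        reverse=True)
        offers[hit.hit_id] = [c.customer_id for c in ranked[:top_k]
                              if match_score(c, hit) >= threshold]
    return offers

if __name__ == "__main__":
    customers = [
        Customer("c1", {"photography"}, {"image-tagging"}),
        Customer("c2", {"translation"}, {"english", "farsi"}),
    ]
    hits = [HIT("h1", {"photography"}, {"image-tagging"}, discount=0.05)]
    print(distribute_hits(hits, customers))   # {'h1': ['c1']}

Ranking by tag overlap and keeping only matches above a threshold mirrors the abstract's two claims: tasks reach more qualified people, and poorly matched workers (the likely source of spam) are filtered out before they ever see the HIT.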

References
  1. Sheng, V. S., Provost, F., and Ipeirotis, P. G. 2008. Get another label? Improving data quality and data mining using multiple, noisy labelers. In KDD '08: Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 614-622. New York, NY, USA: ACM.
  2. Le, J., Edmonds, A., Hester, V., and Biewald, L. 2010. Ensuring quality in crowdsourced search relevance evaluation: The effects of training question distribution. In Proceedings of the ACM SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation.
  3. Kittur, A., Chi, E. H., and Suh, B. 2008. Crowdsourcing user studies with Mechanical Turk. In Proceedings of CHI, April 5-10, Florence, Italy. ACM.
  4. Howe, J. 2006. The Rise of Crowdsourcing. Wired.
  5. Davis, J. 2011. From Crowdsourcing to Crowdservicing. IEEE Internet Computing.
  6. Vukovic, M. 2009. Crowdsourcing for Enterprises. In Proceedings of the 2009 Congress on Services - I, 686-692. Washington, DC, USA: IEEE Computer Society.
  7. Ipeirotis, P. 2010. Analyzing the Amazon Mechanical Turk Marketplace.
  8. Ambati, V., Vogel, S., and Carbonell, J. 2011. Towards Task Recommendation in Micro-Task Markets. AAAI Workshops, North America.
  9. Quinn, A. J., and Bederson, B. B. 2011. Human Computation: A Survey and Taxonomy of a Growing Field. In Proceedings of CHI.
  10. Fort, K., Adda, G., and Cohen, K. B. 2011. Amazon Mechanical Turk: Gold Mine or Coal Mine? Computational Linguistics, Volume 37, Number 2.
  11. Geiger, D., Seedorf, S., Schulze, T., and Nickerson, R. 2011. Managing the Crowds: Towards a Taxonomy of Crowdsourcing Processes. In Proceedings of the Seventeenth Americas Conference on Information Systems.
  12. Pan, Y., and Blevis, E. 2011. A Survey of Crowdsourcing as a Means of Collaboration and Implications of Crowdsourcing for Interaction Design. IEEE.
  13. Chen, J., Menezes, N., and Bradley, A. 2011. Opportunities for Crowdsourcing Research on Amazon Mechanical Turk. Amazon.com, Inc.
  14. Paolacci, G., Chandler, J., and Ipeirotis, P. 2010. Running experiments on Amazon Mechanical Turk. Judgment and Decision Making, Vol. 5, No. 5.
  15. Shaw, A., Horton, J., and Chen, D. 2011. Designing Incentives for Inexpert Human Raters. CSCW 2011, March 19-23, 2011, Hangzhou, China.
  16. Khanna, S., Ratan, A., Davis, J., and Thies, W. 2010. Evaluating and Improving the Usability of Mechanical Turk for Low-Income Workers in India. ACM DEV '10, December 17-18, 2010, London, United Kingdom.
  17. Chen, J., Menezes, N., and Bradley, A. 2011. Opportunities for Crowdsourcing Research on Amazon Mechanical Turk. In Proceedings of the CHI 2011 Workshop on Crowdsourcing and Human Computation.
  18. Eickhoff, C., and de Vries, A. 2011. How crowdsourcable is your task? In Proceedings of the Workshop on Crowdsourcing for Search and Data Mining (CSDM) at the Fourth ACM International Conference on Web Search and Data Mining (WSDM), pp. 11-14.
  19. Dow, S., Kulkarni, A., Bunge, B., Nguyen, T., Klemmer, S., and Hartmann, B. 2011. Shepherding the crowd: Managing and providing feedback to crowd workers. In Proceedings of the CHI Extended Abstracts on Human Factors in Computing Systems, 1669-1674. New York, NY, USA: ACM.
  20. Brabham, D. 2008. Crowdsourcing as a model for problem solving. The International Journal of Research into New Media Technologies, Vol. 14(1), pp. 75-90.
  21. Ipeirotis, P. G., Provost, F., and Wang, J. 2010. Quality management on Amazon Mechanical Turk. HCOMP '10.
  22. Kittur, A., and Kraut, R. E. 2008. Harnessing the wisdom of crowds in Wikipedia: Quality through coordination. In Proceedings of the ACM Conference on Computer Supported Cooperative Work, San Diego, California, USA.
  23. Zaidan, O. F., and Callison-Burch, C. 2011. Crowdsourcing Translation: Professional Quality from Non-Professionals. In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, Portland, Oregon, June 19-24, pp. 1220-1229.
Index Terms

Computer Science
Information Sciences

Keywords

Crowdsourcing, Amazon Mechanical Turk (AMT), Human Intelligence Tasks (HITs), Classifying, Distributing