International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 59 - Number 14
Year of Publication: 2012
Authors: Sarika Patil, Shyam Deshmukh |
DOI: 10.5120/9617-4256
Sarika Patil, Shyam Deshmukh. Survey on Task Assignment Techniques in Hadoop. International Journal of Computer Applications 59(14):15-18, December 2012. DOI=10.5120/9617-4256
MapReduce is a programming model for processing large-scale data in parallel. Its real benefits emerge when the framework is deployed on a large, shared-nothing cluster. The MapReduce framework abstracts away the complexity of running distributed data processing across multiple nodes in a cluster. Hadoop is an open-source implementation of the MapReduce framework that processes vast amounts of data in parallel on large clusters. Hadoop's scheduler was made pluggable, and as a result several scheduling algorithms have since been developed. This paper presents the different schedulers used in Hadoop.
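The map/reduce model the abstract describes can be sketched in plain Java. This is a toy, single-process illustration of the programming model (word count), not Hadoop's actual API: mappers emit key-value pairs, the framework groups intermediate pairs by key (the "shuffle"), and reducers aggregate each group. The class and method names here are illustrative choices.

```java
import java.util.*;
import java.util.stream.*;

// Toy single-process sketch of the MapReduce model:
// map -> shuffle (group by key) -> reduce.
// Real Hadoop runs these phases distributed across cluster nodes.
public class WordCount {
    // Map phase: split each input line into (word, 1) pairs.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // Reduce phase: sum the counts emitted for one key.
    static int reduce(List<Integer> counts) {
        return counts.stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        List<String> input = List.of("hadoop map reduce", "map reduce map");

        // Shuffle: group the intermediate (word, 1) pairs by key.
        Map<String, List<Integer>> grouped = input.stream()
            .flatMap(WordCount::map)
            .collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.mapping(Map.Entry::getValue, Collectors.toList())));

        // Apply the reducer to each group.
        Map<String, Integer> result = new TreeMap<>();
        grouped.forEach((word, counts) -> result.put(word, reduce(counts)));

        System.out.println(result); // {hadoop=1, map=3, reduce=2}
    }
}
```

What Hadoop adds on top of this model is exactly the complexity the abstract mentions: partitioning input across nodes, scheduling map and reduce tasks (the subject of this survey), and handling node failures.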