International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 167 - Number 9
Year of Publication: 2017
Authors: Himani Saraswat, Neeta Sharma, Abhishek Rai
DOI: 10.5120/ijca2017914367
Himani Saraswat, Neeta Sharma, Abhishek Rai. Enhancing the Traditional File System to HDFS: A Big Data Solution. International Journal of Computer Applications. 167, 9 (Jun 2017), 12-14. DOI=10.5120/ijca2017914367
We are in the twenty-first century, also known as the digital era, in which almost everything generates data, whether it is a mobile phone, sensor signals, day-to-day purchases, and much more. With this rapid growth in the amount of data, Big Data has become a current and future frontier for researchers. In big data analysis, computation is performed on massive data sets to extract intelligent, meaningful knowledge, and at the same time storage must be readily available to support the concurrent computation process. Hadoop is designed to meet this complex but meaningful workload. HDFS (Hadoop Distributed File System) is highly fault-tolerant and is designed to be deployed on low-cost hardware. This paper presents the benefits HDFS offers for large data sets, the HDFS architecture, and its role in Hadoop.
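To make the HDFS storage model described above concrete, the following minimal sketch (not taken from the paper) shows how a client writes and reads a file through the standard Hadoop FileSystem Java API. The NameNode address hdfs://localhost:9000 and the path /user/demo/sample.txt are illustrative assumptions for a single-node test setup, not values from the source.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // fs.defaultFS points the client at the NameNode;
        // hdfs://localhost:9000 is an assumed local test address.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical path used only for this illustration.
        Path path = new Path("/user/demo/sample.txt");

        // Write a small file. Behind this call, HDFS splits large files
        // into blocks (128 MB by default) and replicates each block
        // (replication factor 3 by default) across DataNodes, which is
        // the source of the fault tolerance the abstract refers to.
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.writeUTF("hello hdfs");
        }

        // Read the file back through the same FileSystem abstraction;
        // the client never needs to know which DataNodes hold the blocks.
        try (FSDataInputStream in = fs.open(path)) {
            System.out.println(in.readUTF());
        }

        fs.close();
    }
}

The key design point the example illustrates is that the client code stays identical whether the file is a few bytes or many terabytes: block placement, replication, and failover to healthy DataNodes are handled entirely by the NameNode and the HDFS client library, which is what allows HDFS to run reliably on low-cost commodity hardware.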