Research Article

Forest Cover Type Prediction using Cartographic Variables

by Tejas Anant Wagh, R. Bhargavi, Tanmay Anant Wagh, R. M. Samant
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 182 - Number 30
Year of Publication: 2018
DOI: 10.5120/ijca2018918191

Tejas Anant Wagh, R. Bhargavi, Tanmay Anant Wagh, R. M. Samant. Forest Cover Type Prediction using Cartographic Variables. International Journal of Computer Applications 182, 30 (Dec 2018), 14-18. DOI=10.5120/ijca2018918191

@article{10.5120/ijca2018918191,
  author     = {Tejas Anant Wagh and R. Bhargavi and Tanmay Anant Wagh and R. M. Samant},
  title      = {Forest Cover Type Prediction using Cartographic Variables},
  journal    = {International Journal of Computer Applications},
  issue_date = {Dec 2018},
  volume     = {182},
  number     = {30},
  month      = {Dec},
  year       = {2018},
  issn       = {0975-8887},
  pages      = {14-18},
  numpages   = {5},
  url        = {https://ijcaonline.org/archives/volume182/number30/30217-2018918191/},
  doi        = {10.5120/ijca2018918191},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
Abstract

Information about forest land is essential for ecosystem management. This paper presents a classification and prediction study using machine learning techniques. The goal is to predict the forest cover type from cartographic variables such as aspect, slope, soil type, and wilderness area. Several data mining techniques, including decision trees, regression trees, random forests, and gradient boosting machines, are used to predict the forest cover type. Models built with these methods have been developed and tested, achieving accuracies ranging from 19.4% to 74.8%. The Kaggle dataset, a standard benchmark, is used for the comparison study. The models are compared to identify which one predicts the forest cover type with better accuracy. For performance comparison, metrics such as accuracy and error rate are used; an important aspect of the study is the use of multiple performance measures to evaluate the learning methods.
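
The workflow described in the abstract can be illustrated with off-the-shelf tools. The snippet below is a minimal sketch, not the authors' implementation: it assumes scikit-learn and its bundled Covertype dataset (cartographic variables mapped to seven cover types), and the train/test split and model parameters are illustrative choices only.

# Minimal sketch of the comparison described above; illustrative, not the paper's code.
from sklearn.datasets import fetch_covtype
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load the 54 cartographic features (elevation, aspect, slope, soil type,
# wilderness area, ...) and the 7-class forest cover type target.
X, y = fetch_covtype(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Candidate models, as named in the abstract (parameters are illustrative).
models = {
    "decision tree": DecisionTreeClassifier(random_state=42),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1),
    "gradient boosting": GradientBoostingClassifier(random_state=42),
}

# Fit each model and report the two metrics used for comparison in the paper:
# accuracy and error rate (1 - accuracy).
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}, error rate = {1 - acc:.3f}")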

Index Terms

Computer Science
Information Sciences

Keywords

Machine learning, classification and regression, decision trees, random forest, gradient boosting machines.