Research Article

Prediction of Student's Performance based on Incremental Learning

by Pallavi Kulkarni, Roshani Ade
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 99 - Number 14
Year of Publication: 2014
Authors: Pallavi Kulkarni, Roshani Ade
10.5120/17440-8211

Pallavi Kulkarni, Roshani Ade. Prediction of Student's Performance based on Incremental Learning. International Journal of Computer Applications. 99, 14 (August 2014), 10-16. DOI=10.5120/17440-8211

@article{ 10.5120/17440-8211,
author = { Pallavi Kulkarni, Roshani Ade },
title = { Prediction of Student's Performance based on Incremental Learning },
journal = { International Journal of Computer Applications },
issue_date = { August 2014 },
volume = { 99 },
number = { 14 },
month = { August },
year = { 2014 },
issn = { 0975-8887 },
pages = { 10-16 },
numpages = { 7 },
url = { https://ijcaonline.org/archives/volume99/number14/17440-8211/ },
doi = { 10.5120/17440-8211 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Pallavi Kulkarni
%A Roshani Ade
%T Prediction of Student's Performance based on Incremental Learning
%J International Journal of Computer Applications
%@ 0975-8887
%V 99
%N 14
%P 10-16
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Analyzing student datasets is necessary for improving study methods and the overall curriculum. Incremental learning methods are becoming popular because the amount of data and information is rising day by day: classifiers must be updated so that learning scales to ever more training data. In incremental learning, data is processed in chunks and the results are merged, so less memory is required at any one time. For this reason, this paper compares four classifiers that can run incrementally: Naive Bayes, KStar, IBk, and nearest neighbour (KNN). When applied to the Student Evaluation dataset used in this study, the nearest neighbour algorithm is observed to give better accuracy than the others.
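The chunk-wise updating the abstract describes can be illustrated with a minimal incrementally updateable Gaussian Naive Bayes sketch. This is a hedged, stdlib-only illustration of the general technique, not the WEKA implementation the paper evaluates; the `partial_fit` name, class labels, and toy data are invented for the example.

```python
import math
from collections import defaultdict

class IncrementalGaussianNB:
    """Gaussian Naive Bayes whose per-class statistics are updated chunk by
    chunk, so only one chunk of training data is in memory at a time."""

    def __init__(self):
        self.count = defaultdict(int)  # per-class sample count
        self.mean = {}                 # per-class running feature means
        self.m2 = {}                   # per-class running sums of squared deviations

    def partial_fit(self, X, y):
        """Fold one chunk of (features, label) pairs into the running statistics."""
        for xi, yi in zip(X, y):
            if yi not in self.mean:
                self.mean[yi] = [0.0] * len(xi)
                self.m2[yi] = [0.0] * len(xi)
            self.count[yi] += 1
            n = self.count[yi]
            for j, v in enumerate(xi):
                # Welford's online update: no previous chunk is re-read.
                delta = v - self.mean[yi][j]
                self.mean[yi][j] += delta / n
                self.m2[yi][j] += delta * (v - self.mean[yi][j])
        return self

    def predict(self, X):
        """Pick the class with the highest log prior + Gaussian log likelihood."""
        total = sum(self.count.values())
        preds = []
        for xi in X:
            best, best_lp = None, -math.inf
            for c in self.mean:
                lp = math.log(self.count[c] / total)  # log prior
                for j, v in enumerate(xi):
                    var = self.m2[c][j] / self.count[c] + 1e-9  # smoothed variance
                    lp -= 0.5 * math.log(2 * math.pi * var)
                    lp -= (v - self.mean[c][j]) ** 2 / (2 * var)
                if lp > best_lp:
                    best, best_lp = c, lp
            preds.append(best)
        return preds

# Train on two chunks instead of loading the whole set at once.
clf = IncrementalGaussianNB()
clf.partial_fit([[1.0], [1.2], [5.0]], ["low", "low", "high"])   # chunk 1
clf.partial_fit([[0.9], [5.2], [4.8]], ["low", "high", "high"])  # chunk 2
print(clf.predict([[1.1], [5.1]]))  # → ['low', 'high']
```

The memory saving comes from the update rule: each chunk only adjusts the running counts, means, and squared deviations, so earlier chunks never need to be revisited.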

References
  1. Fong, Simon, Zhicong Luo, and Bee Wah Yap. "Incremental Learning Algorithms for Fast Classification in Data Stream." In Computational and Business Intelligence (ISCBI), 2013 International Symposium on, pp. 186-190. IEEE, 2013.
  2. Smith, Michael R., and Tony Martinez. "Improving classification accuracy by identifying and removing instances that should be misclassified." In Neural Networks (IJCNN), The 2011 International Joint Conference on, pp. 2690-2697. IEEE, 2011.
  3. Bunkar, Kamal, U. K. Singh, B. Pandya, and Rajesh Bunkar. "Data mining: Prediction for performance improvement of graduate students using classification." In Wireless and Optical Communications Networks (WOCN), 2012 Ninth International Conference on, pp. 1-5. IEEE, 2012.
  4. Bunkar, Kamal, U. K. Singh, B. Pandya, and Rajesh Bunkar. "Data mining: Prediction for performance improvement of graduate students using classification." In Wireless and Optical Communications Networks (WOCN), 2012 Ninth International Conference on, pp. 1-5. IEEE, 2012.
  5. Bhardwaj, Brijesh Kumar, and Saurabh Pal. "Data Mining: A prediction for performance improvement using classification." arXiv preprint arXiv:1201.3418 (2012).
  6. Nguyen N., Paul J., and Peter H., A Comparative Analysis of Techniques for Predicting Academic Performance. In Proceedings of the 37th ASEE/IEEE Frontiers in Education Conference, pp. 7-12, 2007.
  7. J. Han and M. Kamber, "Data Mining: Concepts and Techniques," Morgan Kaufmann, 2000.
  8. Delavari, N., M. R. Beikzadeh, and M. R. A. Shirazi, "A New Model for Using Data Mining in Higher Educational System", in Proceedings of the 5th International Conference on Information Technology Based Higher Education and Training (ITHET), Istanbul, Turkey, May 31 to June 2, 2004.
  9. John G. Cleary, Leonard E. Trigg: K*: An Instance-based Learner Using an Entropic Distance Measure. In: 12th International Conference on Machine Learning, 108-114, 1995.
  10. Cover, Thomas, and Peter Hart. "Nearest neighbor pattern classification." Information Theory, IEEE Transactions on 13, no. 1 (1967): 21-27.
  11. Martin, Brent. "Instance-based learning: nearest neighbour with generalisation." PhD diss., University of Waikato, 1995.
  12. Roy, Sylvain. "Nearest neighbor with generalization." Christchurch, New Zealand (2002).
  13. Kaushik H. Raviya and Biren Gajjar, "Performance Evaluation of Different Data Mining Classification Algorithm Using WEKA".
  14. S. Vijayarani and M. Muthulakshmi, Comparative Analysis of Bayes and Lazy Classification Algorithms, International Journal of Advanced Research in Computer and Communication Engineering, Vol. 2, Issue 8, August 2013.
  15. George H. John, Pat Langley: Estimating Continuous Distributions in Bayesian Classifiers. In: Eleventh Conference on Uncertainty in Artificial Intelligence, San Mateo, 338-345, 1995.
  16. Kotsiantis, Sotiris B. "An incremental ensemble of classifiers." Artificial Intelligence Review 36, no. 4 (2011): 249-266.
  17. Garg, Bandana. Design and Development of Naive Bayes Classifier. Diss. North Dakota State University, 2013.
  18. Karim, Masud, and Rashedur M. Rahman. "Decision Tree and Naïve Bayes Algorithm for Classification and Generation of Actionable Knowledge for Direct Marketing." Journal of Software Engineering & Applications 6, no. 4 (2013).
  19. D. Aha and D. Kibler. "Instance-based learning algorithms." Machine Learning, Volume 6, Issue 1, January 1991, pp. 37-66.
  20. Kotsiantis, S., Christos Pierrakeas, and P. Pintelas. "Predicting Students' Performance in Distance Learning Using Machine Learning Techniques." Applied Artificial Intelligence 18, no. 5 (2004): 411-426.
  21. E. H. Wang and A. Kuh, "A smart algorithm for incremental learning," in Proc. Int. Joint Conf. Neural Netw., vol. 3, 1992, pp. 121–126.
  22. B. Zhang, "An incremental learning algorithm that optimizes network size and sample size in one trial," in Proc. IEEE Int. Conf. Neural Netw., 1994, pp. 215–220.
  23. F. S. Osorio and B. Amy, "INSS: A hybrid system for constructive machine learning," Neurocomput., vol. 28, pp. 191–205, 1999; R. Schapire, "Strength of weak learning," Machine Learn., vol. 5, pp. 197–227, 1990.
  24. Y. Freund and R. Schapire, "A decision theoretic generalization of on-line learning and an application to boosting," Comput. Syst. Sci., vol. 57, no. 1, pp. 119–139, 1997.
  25. R. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, "Boosting the margins: A new explanation for the effectiveness of voting methods," Ann. Stat., vol. 26, no. 5, pp. 1651–1686, 1998.
  26. Polikar, Robi, Lalita Udpa, Satish S. Udpa, and Vasant Honavar. "Learn++: An Incremental Learning Algorithm for Supervised Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 31, no. 4, November 2001.
  27. N. Littlestone and M. Warmuth, "Weighted majority algorithm," Inform. Comput., vol. 108, pp. 212–261, 1994.
  28. Romero, Cristóbal, and Sebastian Ventura. "Educational data mining: A survey from 1995 to 2005." Expert Systems with Applications 33, no. 1 (2007): 135-146.
  29. Arruabarrena, R., Pérez, T. A., López-Cuadrado, J., & Vadillo, J. G. J. (2002). On evaluating adaptive systems for education. In Adaptive hypermedia (pp. 363–367).
  30. Ingram, A. (1999). Using web server logs in evaluating instructional web sites. Journal of Educational Technology Systems, 28(2), 137–57.
  31. T. Kidera, S. Ozawa, and S. Abe, "An incremental learning algorithm of ensemble classifier systems," in Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN'06), 2006, pp. 3421–3427.
  32. J. Macek, "Incremental learning of ensemble classifiers on ECG data," in Proceedings of the IEEE 18th Symposium on Computer-Based Medical Systems (CBMS'05), Dublin, 2005, pp. 315–320.
  33. M. D. Muhlbaier and R. Polikar, "An ensemble approach for incremental learning in nonstationary environments," in 7th International Workshop on Multiple Classifier Systems, Prague, Lecture Notes in Computer Science, vol. 4472, 2007, pp. 490–500.
  34. R. Polikar, S. Krause, and L. Burd, "Ensemble of classifiers based incremental learning with dynamic voting weight update," in Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN'03), 2003, pp. 2770–2775.
  35. Luo, Jianhui, et al. "Model-based prognostic techniques [maintenance applications]." AUTOTESTCON 2003. IEEE Systems Readiness Technology Conference Proceedings. IEEE, 2003.
  36. Martin, Brent. "Instance-based learning: nearest neighbour with generalisation." PhD diss., University of Waikato, 1995.
  37. Roy, Sylvain (2002). Nearest Neighbor With Generalization. Christchurch, New Zealand.
  38. Salzberg, Steven. "A nearest hyperrectangle learning method." Machine Learning 6, no. 3 (1991): 251-276.
  39. B. K. Bharadwaj and S. Pal. "Data Mining: A prediction for performance improvement using classification", International Journal of Computer Science and Information Security (IJCSIS), Vol. 9, No. 4, pp. 136-140, 2011.
  40. Ramaswami, M., and Bhaskaran, R., CHAID Based Performance Prediction Model in Educational Data Mining, IJCSI International Journal of Computer Science Issues, Vol. 7, Issue 1, No. 1, 2010.
  41. Shannaq, B., Rafael, Y. and Alexandro, V. (2010) 'Student Relationship in Higher Education Using Data Mining Techniques', Global Journal of Computer Science and Technology, vol. 10, no. 11, pp. 54-59.
  42. Martens, David, Bart Baesens, and Tom Fawcett. "Editorial survey: swarm intelligence for data mining." Machine Learning 82, no. 1 (2011): 1-42.
  43. Frank, Eibe, et al. "Weka." Data Mining and Knowledge Discovery Handbook. Springer US, 2005. 1305-1314.
  44. Rajeswari, P., and G. Reena. "Analysis of liver disorder using data mining algorithm." Global Journal of Computer Science and Technology 10, no. 14 (2010).
  45. Norén, G. Niklas, Johan Hopstadius, Andrew Bate, Kristina Star, and I. Ralph Edwards. "Temporal pattern discovery in longitudinal electronic patient records." Data Mining and Knowledge Discovery 20, no. 3 (2010): 361-387.
Index Terms

Computer Science
Information Sciences

Keywords

Student prediction, KStar, NNGe, IBK, Naïve Bayes