Research Article

Deep Learning as a Frontier of Machine Learning: A Review

by Vaibhav Kumar, M. L. Garg
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 182 - Number 1
Year of Publication: 2018
DOI: 10.5120/ijca2018917433

Vaibhav Kumar, M. L. Garg. Deep Learning as a Frontier of Machine Learning: A Review. International Journal of Computer Applications 182, 1 (Jul 2018), 22-30. DOI=10.5120/ijca2018917433

@article{10.5120/ijca2018917433,
  author     = {Vaibhav Kumar and M. L. Garg},
  title      = {Deep Learning as a Frontier of Machine Learning: A Review},
  journal    = {International Journal of Computer Applications},
  issue_date = {Jul 2018},
  volume     = {182},
  number     = {1},
  month      = {Jul},
  year       = {2018},
  issn       = {0975-8887},
  pages      = {22-30},
  numpages   = {9},
  url        = {https://ijcaonline.org/archives/volume182/number1/29725-2018917433/},
  doi        = {10.5120/ijca2018917433},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Vaibhav Kumar
%A M. L. Garg
%T Deep Learning as a Frontier of Machine Learning: A Review
%J International Journal of Computer Applications
%@ 0975-8887
%V 182
%N 1
%P 22-30
%D 2018
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In recent years, applications of machine learning have undergone a revolution, driven by the advancement and introduction of deep learning. With more layers of learning and a higher level of abstraction, deep learning models hold an advantage over conventional machine learning models. A further reason for this advantage is that every aspect of a deep model is learned directly from the data. As data grows in size, and with it the demand to extract adequate insights, conventional machine learning models run into limitations imposed by the algorithms they are built on. This growth in data has in turn triggered the development of more advanced, faster, and more accurate learning algorithms. To stay ahead of the competition, organizations will adopt the models that make the most accurate predictions. In this paper, we present a review of popularly used deep learning techniques.
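The abstract's central claim is that stacking layers lets a model learn every level of representation directly from the data. As a rough illustration (ours, not from the paper), the following NumPy sketch trains a network with two hidden layers end-to-end by backpropagation on the XOR task; the architecture, learning rate, and task are illustrative assumptions, not anything specified in the article.

# Minimal sketch (illustrative, not from the paper): a small multi-layer
# network trained end-to-end, so all weights are learned from the data alone.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, which a single-layer perceptron cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.], [1.], [1.], [0.]])

# Two hidden layers: each layer re-represents its input at a higher
# level of abstraction, the property the abstract attributes to depth.
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 8)), np.zeros(8)
W3, b3 = rng.normal(0, 1, (8, 1)), np.zeros(1)

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(3000):
    # Forward pass through successive layers of representation.
    h1 = relu(X @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    p = sigmoid(h2 @ W3 + b3)

    # Backward pass: gradient of mean cross-entropy loss w.r.t. every
    # parameter, so the whole model is adjusted from the data.
    d3 = (p - y) / len(X)          # grad w.r.t. pre-sigmoid activation
    d2 = (d3 @ W3.T) * (h2 > 0)    # ReLU derivative gates the signal
    d1 = (d2 @ W2.T) * (h1 > 0)

    W3 -= lr * h2.T @ d3; b3 -= lr * d3.sum(axis=0)
    W2 -= lr * h1.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1;  b1 -= lr * d1.sum(axis=0)

print(np.round(p.ravel(), 3))  # predictions should approach [0, 1, 1, 0]

Because a single-layer model cannot represent XOR at all, the sketch also hints at why added depth widens the class of functions a model can learn.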

Index Terms

Computer Science
Information Sciences

Keywords

Deep Learning, Machine Learning, Neural Networks