Research Article

Implementation of Neural Network Back Propagation Training Algorithm on FPGA

by S. L. Pinjare, Arun Kumar M
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 52 - Number 6
Year of Publication: 2012
Authors: S. L. Pinjare, Arun Kumar M
DOI: 10.5120/8203-1599

S. L. Pinjare, Arun Kumar M. Implementation of Neural Network Back Propagation Training Algorithm on FPGA. International Journal of Computer Applications 52, 6 (August 2012), 1-7. DOI=10.5120/8203-1599

@article{ 10.5120/8203-1599,
author = { S. L. Pinjare and Arun Kumar M },
title = { Implementation of Neural Network Back Propagation Training Algorithm on FPGA },
journal = { International Journal of Computer Applications },
issue_date = { August 2012 },
volume = { 52 },
number = { 6 },
month = { August },
year = { 2012 },
issn = { 0975-8887 },
pages = { 1-7 },
numpages = { 7 },
url = { https://ijcaonline.org/archives/volume52/number6/8203-1599/ },
doi = { 10.5120/8203-1599 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
Abstract

This work presents the implementation of a trainable Artificial Neural Network (ANN) chip, which can be trained to implement certain functions. Usually, training of neural networks is done off-line using software tools on a computer system. Neural networks trained off-line are fixed and lack the flexibility of being trained during usage. To overcome this disadvantage, the training algorithm can be implemented on-chip along with the neural network. In this work, the back propagation algorithm is implemented in its gradient descent form to train the neural network to function as basic digital gates and also for image compression. The working of the back propagation algorithm in training the ANN for basic gates and image compression is verified with intensive MATLAB simulations. To implement the hardware, Verilog coding is done for the ANN and the training algorithm. The functionality of the Verilog RTL is verified by simulation using the ModelSim XE III 6.2c simulator tool. The Verilog code is synthesized using the Xilinx ISE 10.1 tool to obtain the netlist of the ANN and the training algorithm. Finally, the netlist is mapped to the FPGA and the hardware functionality is verified using the Xilinx ChipScope Pro Analyzer 10.1 tool. Thus, the concept of a neural network chip that is trainable on-line is successfully implemented.
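For orientation, the sketch below (Python/NumPy, not the authors' MATLAB or Verilog code) illustrates the gradient-descent form of back propagation the abstract describes, training a small 2-4-1 network to behave like a basic digital gate (XOR). The layer sizes, learning rate, and epoch count are illustrative assumptions, not values taken from the paper.

# Minimal sketch, assuming a 2-4-1 sigmoid network trained on the XOR gate
# with plain gradient-descent back propagation (squared-error loss).
import numpy as np

rng = np.random.default_rng(0)

# Training set for the XOR gate: inputs X and target outputs T.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights and zero biases for hidden and output layers.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5  # learning rate (assumed value)

for epoch in range(20000):
    # Forward pass through hidden and output layers.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Backward pass: error deltas for output and hidden layers.
    dy = (y - T) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)

    # Gradient-descent weight and bias updates.
    W2 -= lr * h.T @ dy
    b2 -= lr * dy.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ dh
    b1 -= lr * dh.sum(axis=0, keepdims=True)

print(np.round(y, 3))  # outputs should approach [0, 1, 1, 0]

In an on-chip implementation such as the one described above, the same forward and backward passes would be realized in hardware (fixed-point arithmetic in Verilog RTL) rather than in floating-point software.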

References
  1. Martin T. Hagan, Howard B. Demuth, Mark Beale, "Neural Network Design", China Machine Press, 2002.
  2. Himavathi, Anitha, Muthuramalingam, "Feedforward Neural Network Implementation in FPGA Using Layer Multiplexing for Effective Resource Utilization", IEEE Transactions on Neural Networks, 2007.
  3. Rafael Gadea, Francisco Ballester, Antonio Mocholí, Joaquín Cerdá, "Artificial neural network implementation on a single FPGA of a pipelined on-line backpropagation", Proceedings of the 13th International Symposium on System Synthesis (ISSS), IEEE Computer Society, 2000.
  4. M. Hajek, "Neural Networks", 2005.
  5. R. Rojas, "Neural Networks", Springer-Verlag, Berlin, 1996.
  6. Thiang, Handry Khoswanto, Rendy Pangaldus, "Artificial Neural Network with Steepest Descent Backpropagation Training Algorithm for Modeling Inverse Kinematics of Manipulator", World Academy of Science, Engineering and Technology, 2009.
  7. B. Verma, M. Blumenstein and S. Kulkarni, "A Neural Network based Technique for Data Compression", Griffith University, Gold Coast Campus, Australia.
  8. Venkata Rama Prasad Vaddella, Kurupati Rama, "Artificial Neural Networks for Compression of Digital Images: A Review", International Journal of Reviews in Computing, 2009-2010.
  9. Rafid Ahmed Khalil, "Digital Image Compression Enhancement Using Bipolar Backpropagation Neural Networks", University of Mosul, Iraq, 2006.
  10. S. Anna Durai and E. Anna Saro, "Image Compression with Back Propagation Neural Network using Cumulative Distribution Function", World Academy of Science, Engineering and Technology, 2006.
  11. Omaima N. A. AL-Allaf, "Improving the Performance of Backpropagation Neural Network Algorithm for Image Compression/Decompression System", Journal of Computer Science 6, 2010.
Index Terms

Computer Science
Information Sciences

Keywords

Artificial Neural Network, Back Propagation, On-line Training, FPGA, Image Compression