Dynamically-adaptive Weight in Batch Back Propagation Algorithm via Dynamic Training Rate for Speedup and Accuracy Training

Authors

  • Fatma Susilawati Mohamad
  • Mohammed Sarhan Al Duais

DOI:

https://doi.org/10.26636/jtit.2017.113017

Keywords:

artificial neural network (ANN), batch back propagation algorithm, dynamic training rate, speed up training, accuracy training

Abstract

The main problem with the batch back propagation (BBP) algorithm is slow training, and several parameters, such as the learning rate, need to be adjusted manually. In addition, the BBP algorithm suffers from training saturation. The objective of this study is to speed up the training of the BBP algorithm and to remove training saturation. The training rate is the most significant parameter for increasing the efficiency of the BBP algorithm. In this study, a new dynamic training rate is created to speed up the training of the BBP algorithm. The dynamic batch back propagation (DBBPLR) algorithm, which trains with a dynamic training rate, is presented. This technique was implemented with a sigmoid function. Several data sets were used as benchmarks for testing the effects of the created dynamic training rate. All experiments were performed in Matlab. The experimental results show that the DBBPLR algorithm outperforms the BBP algorithm and existing works, providing faster training with higher accuracy.
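
The sketch below illustrates the general idea of batch back propagation with a learning rate that adapts during training. The adaptation rule shown (grow the rate while the batch error falls, shrink it when the error rises) is an illustrative placeholder, not the paper's DBBPLR formula, which is not given in the abstract; the network size, data, and constants are likewise assumptions.

```python
# Minimal sketch, assuming a one-hidden-layer sigmoid network and a simple
# error-driven rate adaptation rule (NOT the authors' DBBPLR training rate).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_batch_bp(X, y, hidden=4, epochs=500, eta=0.5, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))  # input -> hidden
    W2 = rng.normal(scale=0.5, size=(hidden, 1))            # hidden -> output
    prev_err = np.inf
    for _ in range(epochs):
        # Forward pass over the whole batch (batch back propagation).
        H = sigmoid(X @ W1)
        out = sigmoid(H @ W2)
        err = 0.5 * np.mean((y - out) ** 2)

        # Placeholder dynamic training rate: increase eta while the batch
        # error keeps falling, decrease it when the error rises.
        eta = min(eta * 1.05, 5.0) if err < prev_err else eta * 0.7
        prev_err = err

        # Backward pass: gradients of the mean squared error.
        d_out = (out - y) * out * (1.0 - out)
        d_hid = (d_out @ W2.T) * H * (1.0 - H)
        W2 -= eta * (H.T @ d_out) / len(X)
        W1 -= eta * (X.T @ d_hid) / len(X)
    return W1, W2, eta

# Tiny usage example on the XOR problem (hypothetical benchmark).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2, final_eta = train_batch_bp(X, y)
print("final learning rate:", final_eta)
```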

Published

2017-12-30

Section

ARTICLES FROM THIS ISSUE

How to Cite

[1] F. S. Mohamad and M. Sarhan Al Duais, “Dynamically-adaptive Weight in Batch Back Propagation Algorithm via Dynamic Training Rate for Speedup and Accuracy Training”, JTIT, vol. 70, no. 4, pp. 82–89, Dec. 2017, doi: 10.26636/jtit.2017.113017.