Optimasi Learning Rate Neural Network Backpropagation Dengan Search Direction Conjugate Gradient Pada Electrocardiogram [Optimization of the Backpropagation Neural Network Learning Rate with a Conjugate Gradient Search Direction on Electrocardiograms]

Azwar Riza Habibi, Vivi Aida Fitria, Lukman Hakim
NUMERICAL: Jurnal Matematika dan Pendidikan Matematika, pp. 131-137; doi: 10.25217/numerical.v3i2.603

Abstract: This paper develops a neural network (NN) trained with the conjugate gradient (CG) method. The modification lies in how the linear search direction is defined. The conjugate gradient method offers several formulas for determining the step size, such as the Fletcher-Reeves, Dixon, Polak-Ribière, Hestenes-Stiefel, and Dai-Yuan methods, applied here to discrete electrocardiogram data. Conjugate gradients are used to update the learning rate of the neural network with different step sizes, while the gradient search direction is used to update the weights of the NN. The results show that Polak-Ribière achieves an optimal error, but the weight search direction of the NN widens, so NN training requires more epochs. Hestenes-Stiefel and Dai-Yuan, however, could not find a gradient search direction, so the weights could not be updated, causing the error and the number of epochs to diverge.
Keywords: neural network / electrocardiogram / conjugate gradient / optimization / step size / search direction / Hestenes-Stiefel
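
The abstract names several conjugate gradient formulas for the update coefficient (beta) that scales the previous search direction. As a rough illustration only, not the authors' implementation, the sketch below shows the four standard beta variants cited (the Dixon formula is omitted) and how one would feed a weight update; the function names and the fixed learning rate lr are assumptions made for the example.

```python
import numpy as np

def cg_beta(grad_new, grad_old, d_old, variant="polak-ribiere"):
    """Standard conjugate-gradient coefficient (beta) formulas.

    grad_new, grad_old: gradients at the current and previous iterates.
    d_old: previous search direction.
    """
    y = grad_new - grad_old
    if variant == "fletcher-reeves":
        return (grad_new @ grad_new) / (grad_old @ grad_old)
    if variant == "polak-ribiere":
        return (grad_new @ y) / (grad_old @ grad_old)
    if variant == "hestenes-stiefel":
        # Denominator d_old . y can approach zero, in which case no usable
        # search direction is produced (consistent with the abstract's finding).
        return (grad_new @ y) / (d_old @ y)
    if variant == "dai-yuan":
        return (grad_new @ grad_new) / (d_old @ y)
    raise ValueError(f"unknown variant: {variant}")

def cg_weight_update(w, grad_new, grad_old, d_old, lr=0.01, variant="polak-ribiere"):
    """One conjugate-gradient step: d_new = -grad_new + beta * d_old, w += lr * d_new."""
    beta = cg_beta(grad_new, grad_old, d_old, variant)
    d_new = -grad_new + beta * d_old
    return w + lr * d_new, d_new

# Illustrative usage with dummy gradients (not ECG data):
w = np.zeros(3)
g_old = np.array([0.4, -0.2, 0.1])
g_new = np.array([0.3, -0.1, 0.05])
d_old = -g_old                      # first direction is steepest descent
w, d = cg_weight_update(w, g_new, g_old, d_old, variant="polak-ribiere")
```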
