Cite Article

Second Order Learning Algorithm for Back Propagation Neural Networks

Choose citation format

BibTeX

@article{IJASEIT1956,
   author = {Nazri Mohd Nawi and Noorhamreeza Abdul Hamid and Noor Azah Samsudin and Mohd Amin Mohd Yunus and Mohd Firdaus Ab Aziz},
   title = {Second Order Learning Algorithm for Back Propagation Neural Networks},
   journal = {International Journal on Advanced Science, Engineering and Information Technology},
   volume = {7},
   number = {4},
   year = {2017},
   pages = {1162--1171},
   keywords = {Back propagation algorithm; gradient descent; activation function; second order method; search direction;},
   abstract = {Training of artificial neural networks (ANN) is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. In this work, an improvement to 'batch-mode' offline training methods, whether gradient-based or gradient-free, is proposed. The new procedure computes and improves the search direction along the negative gradient by introducing a 'gain' value for the activation functions and calculating the negative gradient of the error with respect to the weights as well as the 'gain' values when minimizing the error function. The main advantage of this new procedure is that it is easy to incorporate into other, faster optimization algorithms such as the conjugate gradient method and the Quasi-Newton method. The performance of the proposed method, implemented into the conjugate gradient and Quasi-Newton methods, is demonstrated by comparing the simulation results to the neural network toolbox on the chosen benchmark. The results show that the proposed method considerably improves the convergence rate and significantly speeds up the learning process of the general back propagation algorithm because of its new, efficient search direction.},
   issn = {2088-5334},
   publisher = {INSIGHT - Indonesian Society for Knowledge and Human Development},
   url = {http://ijaseit.insightsociety.org/index.php?option=com_content&view=article&id=9&Itemid=1&article_id=1956},
   doi = {10.18517/ijaseit.7.4.1956}
}
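
The abstract describes a procedure that treats the slope ('gain') of each activation function as a trainable parameter and follows the negative gradient of the error with respect to the gains as well as the weights. Below is a minimal Python sketch of that idea for a single sigmoid layer with squared error; the layer sizes, data, and learning rate are hypothetical, and this illustrates the general technique only, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Hypothetical sizes and learning rate (not from the paper).
n_in, n_out, lr = 4, 3, 0.5
W = rng.normal(scale=0.5, size=(n_out, n_in))
b = np.zeros(n_out)
c = np.ones(n_out)                    # per-neuron gains, starting at slope 1

x = rng.normal(size=n_in)             # hypothetical input sample
t = np.array([0.0, 1.0, 0.0])         # hypothetical target

for _ in range(200):
    z = W @ x + b                     # pre-activation
    a = sigmoid(c * z)                # activation with gain
    delta = (a - t) * a * (1.0 - a)   # dE/d(c*z) for E = 0.5*||a - t||^2
    W -= lr * np.outer(c * delta, x)  # dE/dW = (c * delta) x^T
    b -= lr * c * delta               # dE/db = c * delta
    c -= lr * delta * z               # dE/dc = delta * z: the gain update

print("final error:", 0.5 * np.sum((sigmoid(c * (W @ x + b)) - t) ** 2))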

EndNote

%A Nawi, Nazri Mohd
%A Hamid, Noorhamreeza Abdul
%A Samsudin, Noor Azah
%A Mohd Yunus, Mohd Amin
%A Ab Aziz, Mohd Firdaus
%D 2017
%T Second Order Learning Algorithm for Back Propagation Neural Networks
%9 Journal Article
%! Second Order Learning Algorithm for Back Propagation Neural Networks
%K Back propagation algorithm; gradient descent; activation function; second order method; search direction;
%X Training of artificial neural networks (ANN) is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. In this work, an improvement to 'batch-mode' offline training methods, whether gradient-based or gradient-free, is proposed. The new procedure computes and improves the search direction along the negative gradient by introducing a 'gain' value for the activation functions and calculating the negative gradient of the error with respect to the weights as well as the 'gain' values when minimizing the error function. The main advantage of this new procedure is that it is easy to incorporate into other, faster optimization algorithms such as the conjugate gradient method and the Quasi-Newton method. The performance of the proposed method, implemented into the conjugate gradient and Quasi-Newton methods, is demonstrated by comparing the simulation results to the neural network toolbox on the chosen benchmark. The results show that the proposed method considerably improves the convergence rate and significantly speeds up the learning process of the general back propagation algorithm because of its new, efficient search direction.

%U http://ijaseit.insightsociety.org/index.php?option=com_content&view=article&id=9&Itemid=1&article_id=1956
%R doi:10.18517/ijaseit.7.4.1956
%J International Journal on Advanced Science, Engineering and Information Technology
%V 7
%N 4
%@ 2088-5334

IEEE

N. M. Nawi, N. A. Hamid, N. A. Samsudin, M. A. Mohd Yunus, and M. F. Ab Aziz, "Second Order Learning Algorithm for Back Propagation Neural Networks," International Journal on Advanced Science, Engineering and Information Technology, vol. 7, no. 4, pp. 1162-1171, 2017. [Online]. Available: http://dx.doi.org/10.18517/ijaseit.7.4.1956.

RefMan/ProCite (RIS)

TY  - JOUR
AU  - Nawi, Nazri Mohd
AU  - Hamid, Noorhamreeza Abdul
AU  - Samsudin, Noor Azah
AU  - Mohd Yunus, Mohd Amin
AU  - Ab Aziz, Mohd Firdaus
PY  - 2017
TI  - Second Order Learning Algorithm for Back Propagation Neural Networks
JF  - International Journal on Advanced Science, Engineering and Information Technology
VL  - 7
IS  - 4
Y2  - 2017
SP  - 1162
EP  - 1171
SN  - 2088-5334
PB  - INSIGHT - Indonesian Society for Knowledge and Human Development
KW  - Back propagation algorithm; gradient descent; activation function; second order method; search direction;
N2  - Training of artificial neural networks (ANN) is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. In this work, an improvement to 'batch-mode' offline training methods, whether gradient-based or gradient-free, is proposed. The new procedure computes and improves the search direction along the negative gradient by introducing a 'gain' value for the activation functions and calculating the negative gradient of the error with respect to the weights as well as the 'gain' values when minimizing the error function. The main advantage of this new procedure is that it is easy to incorporate into other, faster optimization algorithms such as the conjugate gradient method and the Quasi-Newton method. The performance of the proposed method, implemented into the conjugate gradient and Quasi-Newton methods, is demonstrated by comparing the simulation results to the neural network toolbox on the chosen benchmark. The results show that the proposed method considerably improves the convergence rate and significantly speeds up the learning process of the general back propagation algorithm because of its new, efficient search direction.

UR  - http://ijaseit.insightsociety.org/index.php?option=com_content&view=article&id=9&Itemid=1&article_id=1956
DO  - 10.18517/ijaseit.7.4.1956
ER  - 

RefWorks

RT Journal Article
ID 1956
A1 Nawi, Nazri Mohd
A1 Hamid, Noorhamreeza Abdul
A1 Samsudin, Noor Azah
A1 Mohd Yunus, Mohd Amin
A1 Ab Aziz, Mohd Firdaus
T1 Second Order Learning Algorithm for Back Propagation Neural Networks
JF International Journal on Advanced Science, Engineering and Information Technology
VO 7
IS 4
YR 2017
SP 1162
OP 1171
SN 2088-5334
PB INSIGHT - Indonesian Society for Knowledge and Human Development
K1 Back propagation algorithm; gradient descent; activation function; second order method; search direction;
AB Training of artificial neural networks (ANN) is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. In this work, an improvement to 'batch-mode' offline training methods, whether gradient-based or gradient-free, is proposed. The new procedure computes and improves the search direction along the negative gradient by introducing a 'gain' value for the activation functions and calculating the negative gradient of the error with respect to the weights as well as the 'gain' values when minimizing the error function. The main advantage of this new procedure is that it is easy to incorporate into other, faster optimization algorithms such as the conjugate gradient method and the Quasi-Newton method. The performance of the proposed method, implemented into the conjugate gradient and Quasi-Newton methods, is demonstrated by comparing the simulation results to the neural network toolbox on the chosen benchmark. The results show that the proposed method considerably improves the convergence rate and significantly speeds up the learning process of the general back propagation algorithm because of its new, efficient search direction.

LK http://ijaseit.insightsociety.org/index.php?option=com_content&view=article&id=9&Itemid=1&article_id=1956
DO 10.18517/ijaseit.7.4.1956