A New Correntropy-Based Conjugate Gradient Backpropagation Algorithm for Improving Training in Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems, Volume 29 (12) – Sep 9, 2019


Abstract

Mean square error (MSE) is the most prominent criterion for training neural networks and has been employed in numerous learning problems. In this paper, we propose a group of novel robust information-theoretic backpropagation (BP) methods, termed correntropy-based conjugate gradient BP (CCG-BP). CCG-BP algorithms converge faster than common correntropy-based BP algorithms and outperform common MSE-based CG-BP algorithms, especially in non-Gaussian environments and in cases with impulsive or heavy-tailed noise distributions. In addition, a convergence analysis of this new type of method is provided. Numerical results for several examples of function approximation, synthetic function estimation, and chaotic time series prediction illustrate that the new BP method is more robust than the MSE-based method in the presence of impulsive noise, especially when the signal-to-noise ratio (SNR) is low.
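
To make the criterion concrete, here is a minimal sketch of the two ingredients the abstract names: a correntropy cost (a Gaussian kernel applied to the prediction errors, which down-weights large errors and confers robustness to impulsive noise) combined with a Polak-Ribiere conjugate gradient update. This is an illustration, not the authors' CCG-BP algorithm: a linear model stands in for the neural network, and the kernel width sigma, the iteration budget, and the backtracking line search are assumptions made only for this example.

import numpy as np

def correntropy_loss(errors, sigma=1.0):
    # Negative empirical correntropy; minimizing it maximizes the
    # Gaussian-kernel similarity between targets and predictions.
    return -np.mean(np.exp(-errors**2 / (2.0 * sigma**2)))

def correntropy_grad(X, y, w, sigma=1.0):
    # Gradient of the negative correntropy of e = y - Xw with respect
    # to w. The kernel factor down-weights samples with large errors,
    # which is the source of robustness to outliers.
    e = y - X @ w
    kernel = np.exp(-e**2 / (2.0 * sigma**2))
    return -(X.T @ (kernel * e)) / (len(y) * sigma**2)

def ccg_fit(X, y, sigma=1.0, n_iter=100):
    # Polak-Ribiere+ conjugate gradient on the correntropy loss,
    # with a simple backtracking line search along each direction.
    w = np.zeros(X.shape[1])
    g = correntropy_grad(X, y, w, sigma)
    d = -g
    for _ in range(n_iter):
        step = 1.0
        loss = correntropy_loss(y - X @ w, sigma)
        while correntropy_loss(y - X @ (w + step * d), sigma) > loss and step > 1e-8:
            step *= 0.5
        w = w + step * d
        g_new = correntropy_grad(X, y, w, sigma)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-12))
        d = -g_new + beta * d
        g = g_new
    return w

# Toy usage: a linear target corrupted by impulsive, heavy-tailed noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.3 * rng.standard_t(df=1.5, size=200)
print(ccg_fit(X, y))  # should land near w_true despite the outliers

The intuition matches the abstract's robustness claim: under heavy-tailed noise, the kernel term shrinks the influence of outlier samples on the gradient, whereas an MSE gradient would weight them quadratically.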


ISSN
2162-237X
DOI
10.1109/TNNLS.2018.2827778
PMID
29993752


Journal

IEEE Transactions on Neural Networks and Learning Systems (PubMed)

Published: Sep 9, 2019
