An Optimal Weight Semi-Supervised Learning Machine for Neural Networks with Time Delay


Journal of Classification , Volume OnlineFirst – Dec 10, 2019



Publisher
Springer Journals
Subject
Statistics; Statistical Theory and Methods; Pattern Recognition; Bioinformatics; Signal, Image and Speech Processing; Psychometrics; Marketing
ISSN
0176-4268
eISSN
1432-1343
DOI
10.1007/s00357-019-09352-2

Abstract

In this paper, an optimal weight semi-supervised learning machine for a single-hidden layer feedforward network (SLFN) with time delay is developed. Both the input weights and the output weights of the SLFN are globally optimized with manifold regularization. By feature mapping, input vectors can be placed at prescribed positions in the feature space, so that the separability of all nonlinearly separable patterns is maximized, unlabeled data can be leveraged to improve classification accuracy when labeled data are scarce, and a high degree of recognition accuracy can be achieved with a small number of hidden nodes in the SLFN. Simulation examples are presented to show the excellent performance of the proposed algorithm.

Keywords: Neural networks · Optimal weight learning · Semi-supervised learning · Manifold regularization · Time delay

1 Introduction

Feedforward neural networks (FNNs) have been used extensively in classification and regression applications (Bishop 1995). Among FNNs, single-hidden layer feedforward networks (SLFNs) play an important role. Recently, an interesting learning algorithm named the extreme learning machine (ELM) (Huang et al. 2006) was proposed for SLFNs, in which the input weights of the SLFN are selected uniformly at random and the output weights are trained in batch via least squares.
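For orientation, the basic ELM training step that the introduction describes can be sketched as follows. This is a minimal illustration of Huang et al.'s ELM, not the paper's optimal weight algorithm; the sigmoid activation, the uniform weight range, and the function names are illustrative assumptions.

```python
import numpy as np

def elm_train(X, Y, n_hidden, seed=0):
    """Basic ELM: random (untrained) input weights, batch least-squares output weights."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Input weights W and biases b are drawn uniformly at random and never trained.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    # Hidden-layer output matrix H (sigmoid activation assumed here).
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Output weights beta minimize ||H @ beta - Y||^2 in batch,
    # computed via the Moore-Penrose pseudo-inverse.
    beta = np.linalg.pinv(H) @ Y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```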
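The abstract's manifold regularization term is not spelled out in this preview, and the paper additionally optimizes the input weights. As a hedged sketch only, here is a common way a graph-Laplacian (manifold) penalty enters the output-weight solve of an SLFN in the semi-supervised ELM literature: a k-NN graph is built over labeled and unlabeled inputs, and its Laplacian penalizes outputs that vary across neighboring points. The helper names, the binary-weight graph, and the constants lam and gamma are assumptions, not the paper's formulation.

```python
import numpy as np

def knn_graph_laplacian(X, k=5):
    """Unnormalized Laplacian of a symmetrized k-NN graph with binary edge weights."""
    n = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    A = np.zeros((n, n))
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]  # k nearest neighbors, excluding self
    A[np.repeat(np.arange(n), k), idx.ravel()] = 1.0
    A = np.maximum(A, A.T)                    # symmetrize the adjacency matrix
    return np.diag(A.sum(1)) - A              # L = D - A

def laplacian_rls_output_weights(H, Y, labeled_mask, L, lam=1e-2, gamma=1e-2):
    """Output weights with a graph-Laplacian (manifold) penalty.

    Solves min ||J @ (H @ beta - Y)||^2 + lam * ||beta||^2
               + gamma * tr(beta.T @ H.T @ L @ H @ beta),
    where J selects the labeled rows and unlabeled rows of Y may be zero.
    """
    n, h = H.shape
    J = np.diag(labeled_mask.astype(float))   # 1 for labeled samples, 0 otherwise
    A = H.T @ J @ H + lam * np.eye(h) + gamma * (H.T @ L @ H)
    return np.linalg.solve(A, H.T @ J @ Y)
```

In this formulation, the Laplacian term uses every sample (labeled or not), which is how unlabeled data improve the classifier when labels are scarce, as the abstract describes.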

Journal

Journal of Classification, Springer Journals

Published: Dec 10, 2019
