Hybrid PSO (SGPSO) with the Incorporation of Discretization Operator for Training RBF Neural Network and Optimal Feature Selection



References (68)

Publisher
Springer Journals
Copyright
Copyright © King Fahd University of Petroleum & Minerals 2022. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
ISSN
2193-567X
eISSN
2191-4281
DOI
10.1007/s13369-022-07408-x

Abstract

Particle swarm optimization (PSO) is a relatively recent computational method, based on swarm intelligence, for solving optimization problems. Its popularity and wide acceptance among the research community stem from its simple implementation and the few parameters it requires tuning. In the standard PSO algorithm, the velocity and position of each particle are adjusted according to its personal best and the global best positions. Updating a particle’s velocity with respect to the personal and global optima is simple and appealing; however, this learning scheme can produce the “oscillation” and “two-steps-forward, one-step-back” phenomena. To avoid these shortcomings of standard PSO, we propose a hybrid PSO algorithm, named single-guided PSO (SGPSO), which incorporates a discretization operator into standard PSO to stabilize the oscillation effect and achieve a smoother balance between exploration and exploitation, with an improved convergence rate toward the global optimum. Following IEEE CEC 2014, a set of 14 basic functions and 30 advanced standard functions was chosen to validate the proficiency of the proposed SGPSO technique. The results were also compared with several state-of-the-art meta-heuristic methods from the literature, and two nonparametric tests were performed to establish statistical significance. In addition, SGPSO was used to train a radial basis function neural network on datasets selected from the UCI repository. Finally, SGPSO was applied to select optimal features from benchmark datasets while maintaining accuracy, thereby reducing the complexity of the neural network.
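For context, the velocity/position update the abstract refers to can be sketched as a minimal standard PSO (not the paper's SGPSO — the discretization operator and the authors' parameter settings are not given here, so everything below, including the inertia weight `w` and coefficients `c1`/`c2`, is an illustrative assumption):

```python
import random

def pso_sphere(dim=2, n_particles=20, iters=200, seed=0):
    """Minimal standard PSO minimizing the sphere function sum(x_i^2).

    Each particle's velocity is pulled toward its personal best (pbest)
    and the swarm's global best (gbest) -- the classic update whose
    'oscillation' behavior the paper's SGPSO aims to stabilize.
    Parameter values are illustrative defaults, not the paper's.
    """
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
    fitness = lambda x: sum(v * v for v in x)  # sphere benchmark
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # classic update: inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest_f
```

On the sphere function this converges quickly; the "two-steps-forward, one-step-back" effect arises because each coordinate is updated independently, so improving one dimension can worsen another within the same move.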

Journal

Arabian Journal for Science and Engineering (Springer Journals)

Published: Aug 1, 2023

Keywords: Hybrid-PSO; Discretization operator; RBFNN; Classification; Feature selection; Global optimization
