Feature selection using class-level regularized self-representation


Applied Intelligence, Volume 53 (11) – Jun 1, 2023



Publisher
Springer Journals
Copyright
Copyright © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
ISSN
0924-669X
eISSN
1573-7497
DOI
10.1007/s10489-022-04177-w

Abstract

Feature selection aims to select representative features from the original high-dimensional feature set, and it has drawn much attention in real-world applications such as data mining and pattern recognition. This paper studies the feature selection problem from the viewpoint of feature self-representation. Traditionally, feature self-representation is performed only through whole-level reconstruction, so its feature selection ability is limited by intra-class variations. To address this problem, we propose a new feature selection method, class-level regularized self-representation (CLRSR). In the proposed method, a class-level reconstruction term is designed to reduce the intra-class variations of samples from different categories. By jointly optimizing the whole-level reconstruction and the class-level reconstruction, CLRSR is able to select more discriminative and informative features. Moreover, an iterative algorithm is proposed to minimize the cost function of CLRSR, and its convergence is proven theoretically. Experimental evaluations on six benchmark datasets, in comparison with several state-of-the-art feature selection methods, verify the effectiveness and superiority of CLRSR.
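The abstract describes CLRSR only at a high level: a whole-level reconstruction term, a class-level reconstruction term, and an iterative solver. As a rough illustration of that family of self-representation feature selectors, the sketch below combines a generic whole-level reconstruction with an assumed per-class reconstruction penalty and a row-sparsity (l2,1) regularizer, which is a common choice in self-representation feature selection but is not confirmed by this page. The objective, the trade-off parameters alpha and beta, the function name clrsr_like_scores, and the IRLS-style update are all hypothetical and are not taken from the paper.

```python
# Minimal, illustrative sketch only: NOT the authors' CLRSR algorithm.
# Assumed objective:
#   min_W ||X - XW||_F^2 + alpha * sum_c ||X_c - X_c W||_F^2 + beta * ||W||_{2,1}
# where X is (n_samples, n_features), X_c are the rows of class c, and features
# are ranked by the row-wise l2 norms of the representation matrix W.
import numpy as np

def clrsr_like_scores(X, y, alpha=1.0, beta=0.1, n_iter=50, eps=1e-8):
    """Score features with a self-representation model plus an assumed
    class-level reconstruction term (hypothetical formulation)."""
    d = X.shape[1]
    # Data term: whole-level Gram matrix plus class-level Gram matrices.
    M = X.T @ X + alpha * sum(X[y == c].T @ X[y == c] for c in np.unique(y))

    W = np.eye(d)
    for _ in range(n_iter):
        # Iteratively reweighted least squares for the l2,1 penalty:
        # D_ii = 1 / (2 * ||w_i||_2), then solve (M + beta * D) W = M.
        D = np.diag(1.0 / (2.0 * np.sqrt((W ** 2).sum(axis=1)) + eps))
        W = np.linalg.solve(M + beta * D, M)

    # Larger row norm = more informative feature.
    return np.sqrt((W ** 2).sum(axis=1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((120, 20))
    y = rng.integers(0, 3, size=120)
    scores = clrsr_like_scores(X, y, alpha=0.5, beta=0.1)
    print("Top-5 feature indices:", np.argsort(scores)[::-1][:5])
```

In this sketch the top-scoring feature indices would be kept and the rest discarded; the paper's actual cost function, solver, and convergence analysis should be consulted on the publisher site.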

Journal

Applied Intelligence, Springer Journals

Published: Jun 1, 2023

Keywords: Feature selection; Self-representation; Class-level reconstruction; Iterative algorithm
