Feature selection effectively reduces data dimensionality by selecting a relevant feature subset. Rough set theory provides a powerful theoretical framework for feature selection on categorical data with complete labels. In practice, however, datasets often contain only a small number of labelled objects alongside many unlabelled ones, and most existing feature selection approaches are computationally expensive. To address these problems, a semi-supervised feature selection algorithm based on neighbourhood discernibility with pseudo-labelled granular balls is proposed. First, a set of purity-based granular balls is generated, which reduces the universe by sampling. Then, a neighbourhood discernibility measure is proposed to evaluate the importance of candidate features for both labelled and unlabelled objects. Finally, an ensemble voting algorithm is designed to perform feature selection, so that a feature subset with satisfactory performance is chosen fairly rather than arbitrarily. Experimental results on UCI datasets verify the advantage of the proposed algorithm over competing algorithms in terms of feature subset size, classification accuracy and computational time.
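To make the first step of the pipeline concrete, the sketch below shows one plausible way to generate purity-based granular balls for partially labelled data: each ball is split with 2-means until the fraction of its majority label (its purity) reaches a threshold, and the ball centres then stand in for the full universe. This is a minimal illustration under assumed conventions, not the authors' exact procedure; the function names, the purity threshold, and the use of -1 for unlabelled objects are all assumptions made for the example.

```python
# Illustrative sketch of purity-based granular-ball generation
# (assumptions: names, thresholds, and -1-as-unlabelled are not from the paper).
import numpy as np
from sklearn.cluster import KMeans

def ball_purity(labels):
    """Fraction of the majority class among the labelled points in a ball."""
    if len(labels) == 0:
        return 1.0  # a ball with no labelled points is treated as pure here
    _, counts = np.unique(labels, return_counts=True)
    return counts.max() / counts.sum()

def generate_granular_balls(X, y, purity_threshold=0.9, min_size=4):
    """Recursively split the data into granular balls until each ball is
    pure enough or too small to split; y uses -1 for unlabelled objects."""
    balls, queue = [], [(X, y)]
    while queue:
        bx, by = queue.pop()
        labelled = by[by != -1]
        if len(bx) <= min_size or ball_purity(labelled) >= purity_threshold:
            balls.append((bx.mean(axis=0), len(bx)))  # keep centre and size
            continue
        # split the impure ball into two child balls with 2-means
        parts = KMeans(n_clusters=2, n_init=10).fit_predict(bx)
        if len(np.unique(parts)) < 2:  # degenerate split: keep as one ball
            balls.append((bx.mean(axis=0), len(bx)))
            continue
        for p in (0, 1):
            queue.append((bx[parts == p], by[parts == p]))
    return balls  # ball centres summarise the universe for later steps

# Toy usage: 200 objects, roughly half of them unlabelled (y = -1).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(int)
y[rng.random(200) < 0.5] = -1
print(len(generate_granular_balls(X, y)))
```

In the full method, the resulting balls would feed the neighbourhood discernibility evaluation and the ensemble voting selector described in the abstract; those steps are not reproduced here.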
Applied Intelligence – Springer Journals
Published: Jun 29, 2023
Keywords: Feature selection; Granular ball; Mixed-type data; Semi-supervised; Ensemble selector