
Multi-view subspace clustering for learning joint representation via low-rank sparse representation



References (44)

Publisher
Springer Journals
Copyright
Copyright © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
ISSN
0924-669X
eISSN
1573-7497
DOI
10.1007/s10489-023-04716-z
Publisher site
See Article on Publisher Site

Abstract

Multi-view data are generally collected from distinct sources or domains characterized by both consistent and view-specific properties. However, most existing multi-view subspace clustering approaches encode the self-representation structure through either a consistent representation or a set of specific representations alone, leaving the knowledge of individual views unexploited and yielding a poor self-representation structure. To address this issue, we propose a novel subspace clustering strategy in which the self-representation structure is modeled through both consistent and specific representations. Specifically, we apply a low-rank sparse representation scheme to uncover the globally shared representation structure among all views, and deploy a nearest-neighbor method to preserve the geometrical structure of the consistent and specific representations. The $L_1$-norm and the Frobenius norm are applied to the consistent and specific representations to promote a sparser solution and guarantee a grouping effect. In addition, a novel objective function is formulated and optimized with the alternating direction method to obtain the optimal solution. Finally, experiments conducted on several benchmark datasets show the effectiveness of the proposed method over several state-of-the-art algorithms.
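The self-expressive model underlying such subspace clustering methods can be illustrated with a minimal single-view sketch. This is not the paper's exact objective: it minimizes 0.5‖X − XC‖²_F + λ₁‖C‖₁ + 0.5·λ₂‖C‖²_F by proximal gradient (ISTA), where the $L_1$ term promotes sparsity and the Frobenius term encourages the grouping effect, as the abstract describes. The λ values, the diag(C)=0 constraint, and the solver choice are illustrative assumptions (the paper uses an alternating-direction technique over multiple views).

```python
import numpy as np

def soft_threshold(A, tau):
    # Proximal operator of the L1 norm: shrink entries toward zero by tau.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def self_representation(X, lam1=0.01, lam2=0.01, n_iter=500):
    """Sketch: learn a self-expressive coefficient matrix C for data
    X of shape (features, samples), minimizing
        0.5*||X - X C||_F^2 + lam1*||C||_1 + 0.5*lam2*||C||_F^2
    via proximal gradient descent, zeroing the diagonal after each
    step to avoid the trivial solution C = I."""
    n = X.shape[1]
    C = np.zeros((n, n))
    G = X.T @ X                           # Gram matrix of the samples
    step = 1.0 / (np.linalg.norm(G, 2) + lam2)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = G @ (C - np.eye(n)) + lam2 * C   # gradient of smooth part
        C = soft_threshold(C - step * grad, step * lam1)
        np.fill_diagonal(C, 0.0)          # enforce diag(C) = 0
    return C

# Toy usage: five points on a one-dimensional subspace of R^2 should be
# well reconstructed by sparse combinations of one another.
X = np.vstack([np.linspace(1, 5, 5), 2 * np.linspace(1, 5, 5)])
C = self_representation(X)
rel_err = np.linalg.norm(X - X @ C) / np.linalg.norm(X)
```

In the full multi-view setting, one such coefficient matrix would be shared (consistent) across views while per-view (specific) matrices capture residual structure, and |C| + |C|ᵀ would feed a spectral clustering step.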

Journal

Applied Intelligence (Springer Journals)

Published: Oct 1, 2023

Keywords: Subspace clustering; Low-rank representation; Self-representation structure; Nearest neighbor
