The hinge loss support vector machine (SVM) is sensitive to outliers. This paper proposes a new support vector machine with a pinball loss function (PSVM+). The new model is less sensitive to noise, especially to feature noise around the decision boundary, and it is more stable than the hinge loss support vector machine plus (SVM+) under re-sampling. It also embeds additional information into the corresponding optimization problem, which helps to further improve learning performance. Meanwhile, the computational complexity of the PSVM+ is similar to that of the SVM+.
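A minimal sketch (not taken from the paper) of the key ingredient, comparing the hinge loss with the pinball loss on the classification margin u = 1 - y*f(x); the parameter tau below is a hypothetical value and would be tuned in practice:

import numpy as np

def hinge_loss(u):
    # Standard hinge loss: penalizes only margin violations (u = 1 - y*f(x) > 0).
    return np.maximum(0.0, u)

def pinball_loss(u, tau=0.5):
    # Pinball loss with tau in [0, 1]: identical to the hinge loss for u >= 0,
    # but it also assigns a penalty of -tau*u to correctly classified points
    # with u < 0, so the solution depends on more training points rather than
    # only on points near the margin.
    return np.where(u >= 0.0, u, -tau * u)

# Example margins u = 1 - y*f(x) for a few points
u = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("hinge  :", hinge_loss(u))
print("pinball:", pinball_loss(u, tau=0.5))

Because the pinball loss also penalizes well-classified points, the resulting classifier is driven less by a few boundary points, which is consistent with the reduced sensitivity to feature noise and the re-sampling stability described in the abstract.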
Journal of Classification – Springer Journals
Published: Mar 16, 2018