Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/25518
Title: Training set edition using Rough Set Theory for Semi-supervised Classification
Authors: Grau, Isel
Napoles Ruiz, Gonzalo
Sengupta, Dipankar
Garcia, Maria M.
Nowe, Ann
Issue Date: 2017
Publisher: ISFUROS
Source: Proceedings of the 2nd International Symposium on Fuzzy and Rough Sets (ISFUROS 2017)
Status: In Press
Abstract: Semi-supervised Classification (SSC) is becoming an attractive research field due to the emergence of real-world problems in which the number of unlabeled examples exceeds the number of labeled ones. The natural complexity of such problems rises when designing algorithms with interpretability features. To overcome this challenge, a novel SSC model called Self-labeling Grey-box (SlGb) has recently been proposed. The SlGb algorithm uses a black-box classifier to enlarge the dataset with the unlabeled examples and a white-box classifier to build an interpretable model. In this paper, we attempt to boost the prediction rates of the SlGb algorithm by editing the training set using the knowledge acquired with rough sets. This is achieved by weighting the instances according to their inclusion degree in rough information granules before building the final, white-box classification model.
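The abstract describes a three-step pipeline: a black-box classifier self-labels the unlabeled examples, each instance is weighted by its inclusion degree in a rough information granule, and a white-box model is trained on the weighted set. A minimal sketch of that idea follows; the 1-NN black-box, the k-neighborhood inclusion degree, and the threshold-rule white-box are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the Self-labeling Grey-box (SlGb) workflow with
# rough-set-style instance weighting, on a 1-D toy problem. All component
# choices below are assumptions for illustration only.

def nn_label(point, labeled):
    # Black-box stand-in: assign the class of the nearest labeled example.
    return min(labeled, key=lambda xy: abs(xy[0] - point))[1]

def inclusion_weight(point, label, data, k=3):
    # Inclusion degree of the instance in the granule of its (predicted)
    # class: fraction of its k nearest examples sharing that label.
    neigh = sorted(data, key=lambda xy: abs(xy[0] - point))[:k]
    return sum(1 for _, y in neigh if y == label) / k

# Toy data: class 0 clustered near 0.0, class 1 near 1.0.
labeled = [(0.0, 0), (0.1, 0), (0.9, 1), (1.0, 1)]
unlabeled = [0.05, 0.2, 0.8, 0.95]

# Step 1: the black-box self-labels the unlabeled examples.
enlarged = labeled + [(x, nn_label(x, labeled)) for x in unlabeled]

# Step 2: weight every instance by its inclusion degree.
weighted = [(x, y, inclusion_weight(x, y, enlarged)) for x, y in enlarged]

# Step 3: white-box stand-in: pick the threshold t minimising the
# weighted error of the interpretable rule "predict 1 iff x >= t".
def weighted_error(t, data):
    return sum(w for x, y, w in data if (x >= t) != (y == 1))

best_t = min((x for x, _, _ in weighted), key=lambda t: weighted_error(t, weighted))
print(best_t)
```

Instances deep inside a class granule get weight 1, while borderline self-labeled instances get lower weights, so noisy pseudo-labels contribute less to the final white-box model.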
Keywords: Semi-supervised Classification; Rough Set Theory; Training Set Edition; Instance Weighting; Self-labeling Grey-box
Document URI: http://hdl.handle.net/1942/25518
ISBN: 9789593122580
Category: C1
Type: Proceedings Paper
Validations: vabb 2021
Appears in Collections:Research publications

Files in This Item:
File: ISFUROS 2017 Training set edition using rough set theory for semi-supervised classification.pdf
Description: Peer-reviewed author version (327.45 kB, Adobe PDF, Restricted Access)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.