A New Feature Selection Metric Based on Rough Sets and Information Gain in Text Classification

Rasim Çekik, Mahmut Kaya
{"title":"A New Feature Selection Metric Based on Rough Sets and Information Gain in Text Classification","authors":"Rasim Çekik, Mahmut Kaya","doi":"10.54287/gujsa.1379024","DOIUrl":null,"url":null,"abstract":"In text classification, taking words in text documents as features creates a very high dimensional feature space. This is known as the high dimensionality problem in text classification. The most common and effective way to solve this problem is to select an ideal subset of features using a feature selection approach. In this paper, a new feature selection approach called Rough Information Gain (RIG) is presented as a solution to the high dimensionality problem. Rough Information Gain extracts hidden and meaningful patterns in text data with the help of Rough Sets and computes a score value based on these patterns. The proposed approach utilizes the selection strategy of the Information Gain Selection (IG) approach when pattern extraction is completely uncertain. To demonstrate the performance of the Rough Information Gain in the experimental studies, the Micro-F1 success metric is used to compare with Information Gain Selection (IG), Chi-Square (CHI2), Gini Coefficient (GI), Discriminative Feature Selector (DFS) approaches. The proposed Rough Information Gain approach outperforms the other methods in terms of performance, according to the results.","PeriodicalId":134301,"journal":{"name":"Gazi University Journal of Science Part A: Engineering and Innovation","volume":"3 3","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Gazi University Journal of Science Part A: Engineering and Innovation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.54287/gujsa.1379024","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In text classification, using the words in text documents as features creates a very high-dimensional feature space. This is known as the high dimensionality problem in text classification. The most common and effective way to address it is to select an ideal subset of features with a feature selection approach. In this paper, a new feature selection approach called Rough Information Gain (RIG) is presented as a solution to the high dimensionality problem. Rough Information Gain extracts hidden and meaningful patterns in text data with the help of Rough Sets and computes a score value based on these patterns. When pattern extraction is completely uncertain, the proposed approach falls back on the selection strategy of the Information Gain (IG) approach. In the experimental studies, the Micro-F1 metric is used to compare Rough Information Gain with the Information Gain (IG), Chi-Square (CHI2), Gini Coefficient (GI), and Discriminative Feature Selector (DFS) approaches. According to the results, the proposed Rough Information Gain approach outperforms the other methods.
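
The abstract does not give the RIG scoring formula, so the minimal sketch below only illustrates the general filter-style selection loop and the documented fallback: the standard Information Gain score IG(t) = H(C) − P(t)·H(C|t) − P(t̄)·H(C|t̄), together with a hypothetical rough-set-inspired "completely uncertain" check (a term whose documents spread over every class yields an empty lower approximation for each class). Names such as `information_gain`, `is_completely_uncertain`, and `select_features` are illustrative assumptions, not functions from the paper.

```python
# Minimal sketch (not the paper's RIG score): a filter-style selection loop
# that ranks terms and, as in the paper's fallback, scores them with standard
# Information Gain. The "completely uncertain" test is a hypothetical
# rough-set-inspired stand-in: a term is fully uncertain when the documents
# containing it are spread over every class (empty lower approximation).
import math
from collections import Counter


def entropy(labels):
    """Shannon entropy H(C) of a sequence of class labels."""
    total = len(labels)
    counts = Counter(labels)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())


def information_gain(term, docs, labels):
    """IG(t) = H(C) - P(t) H(C|t) - P(~t) H(C|~t) for binary term presence."""
    with_t = [y for d, y in zip(docs, labels) if term in d]
    without_t = [y for d, y in zip(docs, labels) if term not in d]
    ig = entropy(labels)
    for subset in (with_t, without_t):
        if subset:  # skip empty partitions to avoid division by zero
            ig -= (len(subset) / len(labels)) * entropy(subset)
    return ig


def is_completely_uncertain(term, docs, labels):
    """Hypothetical check: the documents containing `term` hit every class,
    so the induced pattern certainly discriminates no class on its own."""
    classes_with_term = {y for d, y in zip(docs, labels) if term in d}
    return classes_with_term == set(labels)


def select_features(docs, labels, vocabulary, k):
    """Rank terms and keep the top k (a generic filter-selection loop)."""
    scores = {}
    for term in vocabulary:
        if is_completely_uncertain(term, docs, labels):
            # Documented fallback in the paper: use the IG selection strategy.
            scores[term] = information_gain(term, docs, labels)
        else:
            # The paper computes a rough-set-based score here; plain IG is
            # used below purely as an illustrative placeholder.
            scores[term] = information_gain(term, docs, labels)
    return sorted(scores, key=scores.get, reverse=True)[:k]


if __name__ == "__main__":
    # Toy corpus: each document is a set of terms with a class label.
    docs = [{"price", "stock"}, {"goal", "match"},
            {"stock", "market"}, {"match", "team"}]
    labels = ["economy", "sport", "economy", "sport"]
    vocab = {"price", "stock", "goal", "match", "market", "team"}
    print(select_features(docs, labels, vocab, k=3))
```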