Neural Network Pruning in Unsupervised Aspect Detection based on Aspect Embedding

Muhammad Haris Maulana, M. L. Khodra
DOI: 10.22146/ijccs.72981
Journal: IJCCS Indonesian Journal of Computing and Cybernetics Systems
Published: 2022-10-31 (Journal Article)

Abstract

Aspect detection systems for online reviews, especially those based on unsupervised models, are considered strategically better suited to processing online reviews, which typically form a very large collection of unstructured data. Deep learning models based on aspect embeddings have been designed for this problem; however, they still rely on redundant word embeddings and are sensitive to initialization, which can significantly affect model performance. In this research, a pruning approach is used to reduce the redundancy of the deep learning model's connections, with the expectation of producing a model with similar or better performance. The research comprises several experiments comparing the results of pruning the model's network weights using a general neural network pruning strategy and the lottery ticket hypothesis. The results show that pruning the unsupervised aspect detection model can, in general, produce smaller submodels with similar performance, even when a significant fraction of the weights is pruned. Our sparse model, with 80% of its total weights pruned, performs similarly to the original model. Our current pruning implementation, however, has not been able to produce sparse models with better performance.
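The two strategies the abstract contrasts can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it assumes global magnitude pruning (zero out the smallest-magnitude 80% of weights) for the general strategy, and, for the lottery ticket hypothesis, rewinding the surviving weights to their initial values before retraining. The function names and toy data are hypothetical.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.8):
    """General pruning strategy: zero out the smallest-magnitude
    `sparsity` fraction of weights, keeping the rest."""
    threshold = np.quantile(np.abs(weights).ravel(), sparsity)
    mask = np.abs(weights) > threshold
    return weights * mask, mask

def lottery_ticket_rewind(initial_weights, mask):
    """Lottery ticket hypothesis: keep the pruning mask but reset
    the surviving weights to their original initialization, so the
    sparse subnetwork can be retrained from (near) scratch."""
    return initial_weights * mask

# Toy example: an "initial" weight matrix and a "trained" one.
rng = np.random.default_rng(0)
w_init = rng.normal(size=(4, 5))
w_trained = w_init + rng.normal(scale=0.1, size=(4, 5))

w_pruned, mask = magnitude_prune(w_trained, sparsity=0.8)
w_ticket = lottery_ticket_rewind(w_init, mask)

print(mask.mean())  # fraction of weights kept: 0.2
```

In both cases the model ends up with the same sparse connectivity; the difference is whether the kept weights retain their trained values (general pruning) or are rewound to initialization (lottery ticket) before further training.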