Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling

Safura Adeela Sukiman, Nor Azura Husin
{"title":"迈向个性化和简化说明文:预训练分类和神经网络协同建模","authors":"Safura Adeela Sukiman, Nor Azura Husin","doi":"10.1109/iscaie54458.2022.9794534","DOIUrl":null,"url":null,"abstract":"The goal of automatic text simplification is to reorganize complex text structures into simpler, more comprehendible texts while retaining their original meaning. The automatic text simplification model, coupled with the personalization element, makes it an indispensable tool for assisting students with learning disabilities who struggle to comprehend expository texts found in school textbooks. In recent years, neural networks have been widely embraced in simplified text generation, with most earlier researchers focusing on the Long Short-Term Memory (LSTM), Recurrent Neural Network (RNN), and Transformer models. In general, however, the majority of their efforts resulted in simple, generic texts, and a lack of cognitive-based personalization elements was found in their models. In this paper, we present the concept of generating personalized and simplified expository texts by joining both pre-trained classification and neural networks models. The pre-trained classification aims to predict complex text structures and phrases that give challenges for students with learning disabilities to comprehend, while the neural networks model is then used to generate simplified expository texts based on the predicted text complexity. The advantage of these joint models is the ability to generate simplified expository texts adapted to the cognitive level of students with learning disabilities. 
This opens up opportunities for continuously personalized learning, makes them less struggling, and increases their motivation to stay competitive with their peers.","PeriodicalId":395670,"journal":{"name":"2022 IEEE 12th Symposium on Computer Applications & Industrial Electronics (ISCAIE)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling\",\"authors\":\"Safura Adeela Sukiman, Nor Azura Husin\",\"doi\":\"10.1109/iscaie54458.2022.9794534\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The goal of automatic text simplification is to reorganize complex text structures into simpler, more comprehendible texts while retaining their original meaning. The automatic text simplification model, coupled with the personalization element, makes it an indispensable tool for assisting students with learning disabilities who struggle to comprehend expository texts found in school textbooks. In recent years, neural networks have been widely embraced in simplified text generation, with most earlier researchers focusing on the Long Short-Term Memory (LSTM), Recurrent Neural Network (RNN), and Transformer models. In general, however, the majority of their efforts resulted in simple, generic texts, and a lack of cognitive-based personalization elements was found in their models. In this paper, we present the concept of generating personalized and simplified expository texts by joining both pre-trained classification and neural networks models. 
The pre-trained classification aims to predict complex text structures and phrases that give challenges for students with learning disabilities to comprehend, while the neural networks model is then used to generate simplified expository texts based on the predicted text complexity. The advantage of these joint models is the ability to generate simplified expository texts adapted to the cognitive level of students with learning disabilities. This opens up opportunities for continuously personalized learning, makes them less struggling, and increases their motivation to stay competitive with their peers.\",\"PeriodicalId\":395670,\"journal\":{\"name\":\"2022 IEEE 12th Symposium on Computer Applications & Industrial Electronics (ISCAIE)\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-05-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 12th Symposium on Computer Applications & Industrial Electronics (ISCAIE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/iscaie54458.2022.9794534\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 12th Symposium on Computer Applications & Industrial Electronics (ISCAIE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iscaie54458.2022.9794534","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

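The abstract describes a two-stage pipeline: a pre-trained classifier first flags text structures and phrases that are too complex for the target reader, and a neural generator then rewrites only the flagged material. The paper gives no implementation details, so the following is a minimal sketch of that two-stage idea only; the complexity threshold, the word-length heuristic, and the synonym table are all hypothetical toy stand-ins for the paper's pre-trained classifier and neural (LSTM/RNN/Transformer) generator.

```python
# Two-stage sketch of the co-modeling idea from the abstract.
# Stage 1 (stand-in for the pre-trained classifier): score how complex
# a sentence is for the target reader. Stage 2 (stand-in for the neural
# generator): rewrite only sentences the classifier flags, leaving
# already-simple text untouched.

# Hypothetical threshold and simplification lexicon -- not from the paper.
COMPLEX_WORD_LEN = 9
SIMPLER = {
    "utilize": "use",
    "comprehend": "understand",
    "approximately": "about",
}

def predict_complexity(sentence: str) -> float:
    """Stage 1 stand-in: fraction of words judged 'complex'
    (long words, or words with a known simpler alternative)."""
    words = sentence.split()
    if not words:
        return 0.0
    complex_words = [w for w in words
                     if len(w) >= COMPLEX_WORD_LEN or w.lower() in SIMPLER]
    return len(complex_words) / len(words)

def simplify(sentence: str, threshold: float = 0.2) -> str:
    """Stage 2 stand-in: rewrite only if stage 1 flags the sentence."""
    if predict_complexity(sentence) < threshold:
        return sentence  # already simple enough for the target reader
    return " ".join(SIMPLER.get(w.lower(), w) for w in sentence.split())

text = "Students utilize textbooks to comprehend approximately ten topics."
print(simplify(text))
```

The design point the sketch preserves is the conditioning: generation is driven by the predicted complexity, which is what would let a real system adapt its output to an individual student's cognitive level rather than producing one generic simplification for everyone.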