A novel hybrid approach for text encoding: Cognitive Attention To Syntax model to detect online misinformation

Impact Factor 2.7 · CAS Tier 3, Computer Science · JCR Q3, Computer Science, Artificial Intelligence
Géraud Faye, Wassila Ouerdane, Guillaume Gadek, Souhir Gahbiche, Sylvain Gatepaille
{"title":"一种新的混合文本编码方法:用于在线错误信息检测的认知注意语法模型","authors":"Géraud Faye ,&nbsp;Wassila Ouerdane ,&nbsp;Guillaume Gadek ,&nbsp;Souhir Gahbiche ,&nbsp;Sylvain Gatepaille","doi":"10.1016/j.datak.2023.102230","DOIUrl":null,"url":null,"abstract":"<div><p>Most approaches for text encoding rely on the attention mechanism, at the core of the transformers architecture and large language models. The understanding of this mechanism is still limited and present inconvenients such as lack of interpretability, large requirements of data and low generalization. Based on current understanding of the attention mechanism, we propose CATS (Cognitive Attention To Syntax), a neurosymbolic attention encoding approach based on the syntactic understanding of texts. This approach has on-par to better performance compared to classical attention and displays expected advantages of neurosymbolic AI such as better functioning with little data and better explainability. This layer has been tested on the task of misinformation detection but is general and could be used in any task involving natural language processing.</p></div>","PeriodicalId":55184,"journal":{"name":"Data & Knowledge Engineering","volume":"148 ","pages":"Article 102230"},"PeriodicalIF":2.7000,"publicationDate":"2023-09-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A novel hybrid approach for text encoding: Cognitive Attention To Syntax model to detect online misinformation\",\"authors\":\"Géraud Faye ,&nbsp;Wassila Ouerdane ,&nbsp;Guillaume Gadek ,&nbsp;Souhir Gahbiche ,&nbsp;Sylvain Gatepaille\",\"doi\":\"10.1016/j.datak.2023.102230\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Most approaches for text encoding rely on the attention mechanism, at the core of the transformers architecture and large language models. The understanding of this mechanism is still limited and present inconvenients such as lack of interpretability, large requirements of data and low generalization. Based on current understanding of the attention mechanism, we propose CATS (Cognitive Attention To Syntax), a neurosymbolic attention encoding approach based on the syntactic understanding of texts. This approach has on-par to better performance compared to classical attention and displays expected advantages of neurosymbolic AI such as better functioning with little data and better explainability. 
This layer has been tested on the task of misinformation detection but is general and could be used in any task involving natural language processing.</p></div>\",\"PeriodicalId\":55184,\"journal\":{\"name\":\"Data & Knowledge Engineering\",\"volume\":\"148 \",\"pages\":\"Article 102230\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2023-09-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Data & Knowledge Engineering\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0169023X23000903\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Data & Knowledge Engineering","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0169023X23000903","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Most approaches to text encoding rely on the attention mechanism, at the core of the transformer architecture and large language models. Our understanding of this mechanism is still limited, and it presents drawbacks such as a lack of interpretability, large data requirements, and poor generalization. Building on the current understanding of attention, we propose CATS (Cognitive Attention To Syntax), a neurosymbolic attention encoding approach based on the syntactic understanding of texts. This approach performs on par with or better than classical attention and displays the expected advantages of neurosymbolic AI, such as working better with little data and offering better explainability. The layer has been tested on the task of misinformation detection, but it is general and could be used in any natural language processing task.
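The abstract gives no implementation details, but the core idea it describes, constraining attention with syntactic structure, can be sketched. The following is a hypothetical illustration only, not the authors' CATS layer: the `SyntaxMaskedAttention` module and its dependency-adjacency input are assumptions made for this sketch.

```python
# Hypothetical sketch of syntax-guided attention (NOT the paper's CATS layer):
# attention scores are masked so tokens may only attend to tokens they are
# linked to in a dependency parse, rather than to the whole sequence.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SyntaxMaskedAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, syntax_adj: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        # syntax_adj: (batch, seq, seq), 1.0 where two tokens are linked in the
        # dependency parse (self-loops included so no row is fully masked).
        scores = self.q(x) @ self.k(x).transpose(-2, -1) * self.scale
        # Forbid attention between syntactically unrelated token pairs.
        scores = scores.masked_fill(syntax_adj == 0, float("-inf"))
        return F.softmax(scores, dim=-1) @ self.v(x)


if __name__ == "__main__":
    x = torch.randn(1, 5, 64)
    adj = torch.eye(5).unsqueeze(0)  # toy adjacency: self-loops only
    out = SyntaxMaskedAttention(64)(x, adj)
    print(out.shape)  # torch.Size([1, 5, 64])
```

In practice the adjacency matrix would come from a dependency parser (e.g. spaCy), which is one plausible way a symbolic syntactic prior could be combined with a neural attention layer, consistent with the neurosymbolic framing in the abstract.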

Journal
Data & Knowledge Engineering (Engineering & Technology; Computer Science: Artificial Intelligence)
CiteScore: 5.00
Self-citation rate: 0.00%
Annual publications: 66
Review time: 6 months
Journal description: Data & Knowledge Engineering (DKE) stimulates the exchange of ideas and interaction between the two related fields of data engineering and knowledge engineering. DKE reaches a worldwide audience of researchers, designers, managers and users. The major aim of the journal is to identify, investigate and analyze the underlying principles in the design and effective use of these systems.