Multiple Words to Single Word Associations Using Masked Language Models

Yuya Soma, Y. Horiuchi, S. Kuroiwa
{"title":"Multiple Words to Single Word Associations Using Masked Language Models","authors":"Yuya Soma, Y. Horiuchi, S. Kuroiwa","doi":"10.1109/KST57286.2023.10086780","DOIUrl":null,"url":null,"abstract":"In this paper, we examine a word association task that predicts a correct associative word from five stimulus words using Masked Language Models (hereafter referred to as MLMs). For MLMs, we used BERT and gMLP. Since our word association task uses only nouns for both stimulus and associative words, we trained new models by restricting masked tokens to nouns. In our experiment, we input sentences such as “The prefecture associated with Mt. Fuji, Lake Hamana, … and eels is MASK. (富士山、浜名湖、⋯、うなぎから連想する都道府県は MASK です。),” so that MASK outputs an associative word. In the experiments, we also examined adding Japanese quotation marks 「」 before and after the MASK, i.e., 「MASK」. The experiment results showed that the highest percentage of correct answers, 49%, was obtained by adding 「」 before and after the MASK (74% of the correct answers were within the top five words).","PeriodicalId":351833,"journal":{"name":"2023 15th International Conference on Knowledge and Smart Technology (KST)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 15th International Conference on Knowledge and Smart Technology (KST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/KST57286.2023.10086780","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In this paper, we examine a word association task in which the correct associative word must be predicted from five stimulus words using Masked Language Models (hereafter referred to as MLMs). For the MLMs, we used BERT and gMLP. Since our word association task uses only nouns for both the stimulus and associative words, we trained new models by restricting the masked tokens to nouns. In our experiments, we input sentences such as “The prefecture associated with Mt. Fuji, Lake Hamana, …, and eels is MASK. (富士山、浜名湖、⋯、うなぎから連想する都道府県は MASK です。)” so that the model outputs an associative word at the MASK position. We also examined adding the Japanese corner brackets 「」 before and after the MASK, i.e., 「MASK」. The experimental results showed that the highest accuracy, 49%, was obtained by adding 「」 around the MASK, and the correct answer appeared within the top five candidate words 74% of the time.
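A minimal sketch of this prompting setup, using the Hugging Face fill-mask pipeline with an off-the-shelf Japanese BERT checkpoint. The checkpoint name (cl-tohoku/bert-base-japanese) and the shortened three-stimulus prompt are illustrative assumptions; the paper's own models were additionally trained with masking restricted to nouns, which this sketch does not reproduce.

```python
# pip install transformers fugashi ipadic  (the Japanese BERT tokenizer needs fugashi/ipadic)
from transformers import pipeline

# Assumption: a generic pretrained Japanese BERT stands in for the paper's
# fine-tuned, noun-masked models, which are not publicly available.
fill_mask = pipeline("fill-mask", model="cl-tohoku/bert-base-japanese")

# Prompt template from the paper, with the 「」 brackets around the mask
# token, the variant that gave the best accuracy in the experiments.
mask = fill_mask.tokenizer.mask_token
prompt = f"富士山、浜名湖、うなぎから連想する都道府県は「{mask}」です。"

# Print the top five candidates, mirroring the paper's top-5 evaluation.
for candidate in fill_mask(prompt, top_k=5):
    print(candidate["token_str"], round(candidate["score"], 4))
```

The intended answer for this example is Shizuoka (静岡); whether a plain pretrained checkpoint ranks it first will vary, which is the gap the authors' noun-restricted training is meant to close.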