OQA: A question-answering dataset on orthodontic literature

Maxime Rousseau, Amal Zouaq, Nelly Huynh
{"title":"OQA: A question-answering dataset on orthodontic literature","authors":"Maxime Rousseau, Amal Zouaq, Nelly Huynh","doi":"10.1101/2024.07.05.24309412","DOIUrl":null,"url":null,"abstract":"Background: The near-exponential increase in the number of publications in orthodontics poses a challenge for efficient literature appraisal and evidence-based practice. Language models (LM) have the potential, through their question-answering fine-tuning, to assist clinicians and researchers in critical appraisal of scientific information and thus to improve decision-making.\nMethods: This paper introduces OrthodonticQA (OQA), the first question-answering dataset in the field of dentistry which is made publicly available under a permissive license. A framework is proposed which includes utilization of PICO information and templates for question formulation, demonstrating their broader applicability across various specialties within dentistry and healthcare. A selection of transformer LMs were trained on OQA to set performance baselines.\nResults: The best model achieved a mean F1 score of 77.61 (SD 0.26) and a score of 100/114 (87.72\\%) on human evaluation. Furthermore, when exploring performance according to grouped subtopics within the field of orthodontics, it was found that for all LMs the performance can vary considerably across topics.\nConclusion: Our findings highlight the importance of subtopic evaluation and superior performance of paired domain specific model and tokenizer.","PeriodicalId":501363,"journal":{"name":"medRxiv - Dentistry and Oral Medicine","volume":"366 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"medRxiv - Dentistry and Oral Medicine","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1101/2024.07.05.24309412","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Background: The near-exponential increase in the number of publications in orthodontics poses a challenge for efficient literature appraisal and evidence-based practice. Language models (LMs), fine-tuned for question answering, have the potential to assist clinicians and researchers in the critical appraisal of scientific information and thus to improve decision-making.

Methods: This paper introduces OrthodonticQA (OQA), the first question-answering dataset in the field of dentistry to be made publicly available under a permissive license. A framework is proposed that uses PICO information and templates for question formulation, demonstrating their broader applicability across specialties within dentistry and healthcare. A selection of transformer LMs was trained on OQA to set performance baselines.

Results: The best model achieved a mean F1 score of 77.61 (SD 0.26) and a score of 100/114 (87.72%) on human evaluation. Furthermore, when performance was examined across grouped subtopics within orthodontics, it was found that every LM's performance can vary considerably across topics.

Conclusion: Our findings highlight the importance of subtopic evaluation and the superior performance of pairing a domain-specific model with a domain-specific tokenizer.
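The abstract mentions that question formulation is driven by PICO information (Population, Intervention, Comparison, Outcome) combined with templates, but does not reproduce the templates themselves. The sketch below is a hypothetical illustration of that general idea; the field names, template wording, and example values are assumptions, not the paper's actual question-generation pipeline.

```python
# Hypothetical sketch of PICO-driven question templates (not the paper's templates).
from dataclasses import dataclass


@dataclass
class PICO:
    population: str
    intervention: str
    comparison: str
    outcome: str


# Illustrative templates; the actual OQA templates are defined in the paper.
TEMPLATES = [
    "In {population}, what is the effect of {intervention} compared to {comparison} on {outcome}?",
    "Does {intervention} improve {outcome} relative to {comparison} in {population}?",
]


def generate_questions(pico: PICO) -> list[str]:
    """Fill each template with the PICO elements extracted from a study."""
    return [
        t.format(
            population=pico.population,
            intervention=pico.intervention,
            comparison=pico.comparison,
            outcome=pico.outcome,
        )
        for t in TEMPLATES
    ]


# Example with made-up orthodontic PICO elements.
example = PICO(
    population="adolescents with Class II malocclusion",
    intervention="functional appliance therapy",
    comparison="untreated controls",
    outcome="mandibular growth",
)
print(generate_questions(example))
```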
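The reported mean F1 of 77.61 is not defined in the abstract. A reasonable assumption, common for extractive question answering, is the SQuAD-style token-overlap F1 between the predicted answer span and the reference answer, averaged over the dataset. The following is a minimal sketch under that assumption (tokenization by whitespace only, no punctuation or article stripping).

```python
# Minimal sketch of SQuAD-style token-level F1, assumed here as the evaluation metric.
from collections import Counter


def token_f1(prediction: str, reference: str) -> float:
    """Bag-of-tokens F1 between a predicted answer and the reference answer."""
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    if not pred_tokens or not ref_tokens:
        # Both empty counts as a match; otherwise no overlap is possible.
        return float(pred_tokens == ref_tokens)
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)


def mean_f1(predictions: list[str], references: list[str]) -> float:
    """Dataset-level score: mean F1 over all QA pairs, scaled to 0-100."""
    scores = [token_f1(p, r) for p, r in zip(predictions, references)]
    return 100.0 * sum(scores) / len(scores)


# Toy usage example with invented answers.
print(mean_f1(["rapid maxillary expansion"], ["maxillary expansion"]))
```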