Properly Offer Options to Improve the Practicality of Software Document Completion Tools

Zhipeng Cai, Songqiang Chen, Xiaoyuan Xie
{"title":"Properly Offer Options to Improve the Practicality of Software Document Completion Tools","authors":"Zhipeng Cai, Songqiang Chen, Xiaoyuan Xie","doi":"10.1109/ICPC58990.2023.00038","DOIUrl":null,"url":null,"abstract":"With the great progress in deep learning and natural language processing, many completion tools are proposed to help practitioners efficiently fill in various fields in software document. However, most of these tools offer their users only one option and this option generally requires much revision to meet a satisfactory quality, which hurts much practicality of the completion tools. By finding that the beam search model of such tools often generates a much better output at relatively high confidence and considering the interactive use of such tools, we advise such tools to offer multiple high-confidence model outputs for more chances of offering a good option. And we further suggest these tools offer dissimilar outputs to expand the chance of including a better output in a few options. To evaluate our whole idea, we design a clustering-based initial method to help these tools properly offer some dissimilar model outputs as options. We adopt this method to improve nine completion tools for three software document fields. Results show it can help all the nine tools offer an option that needs less revision from users and thus effectively improve the practicality of tools.","PeriodicalId":376593,"journal":{"name":"2023 IEEE/ACM 31st International Conference on Program Comprehension (ICPC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE/ACM 31st International Conference on Program Comprehension (ICPC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPC58990.2023.00038","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

With the great progress in deep learning and natural language processing, many completion tools have been proposed to help practitioners efficiently fill in various fields of software documents. However, most of these tools offer their users only a single option, and this option generally requires substantial revision before it reaches satisfactory quality, which greatly hurts the practicality of the completion tools. Observing that the beam search model behind such tools often generates a much better output at relatively high confidence, and considering that these tools are used interactively, we advise such tools to offer multiple high-confidence model outputs to increase the chance of presenting a good option. We further suggest that these tools offer dissimilar outputs to expand the chance of including a better output among a few options. To evaluate the whole idea, we design a clustering-based initial method that helps these tools properly offer dissimilar model outputs as options. We adopt this method to improve nine completion tools for three software document fields. Results show that it helps all nine tools offer an option that needs less revision from users and thus effectively improves the practicality of the tools.
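The abstract only outlines the approach at a high level. As a minimal sketch (assuming a simple Jaccard token-overlap similarity and a greedy, confidence-ordered selection, which may differ from the paper's actual clustering-based method), the following Python snippet illustrates how a tool could turn ranked beam-search candidates into a few dissimilar, high-confidence options:

```python
from typing import List, Tuple


def token_similarity(a: str, b: str) -> float:
    """Jaccard similarity over whitespace tokens; a simple stand-in for
    whatever text-similarity measure a concrete tool might use."""
    ta, tb = set(a.split()), set(b.split())
    if not ta and not tb:
        return 1.0
    return len(ta & tb) / len(ta | tb)


def select_dissimilar_options(candidates: List[Tuple[str, float]],
                              max_options: int = 3,
                              sim_threshold: float = 0.5) -> List[str]:
    """Greedily pick high-confidence beam-search outputs that are pairwise
    dissimilar, so the few options shown to the user are not near-duplicates.

    candidates: (text, confidence) pairs, e.g. produced by beam search.
    """
    # Consider candidates from most to least confident.
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    options: List[str] = []
    for text, _score in ranked:
        # Skip a candidate that is too similar to an already chosen option.
        if any(token_similarity(text, kept) >= sim_threshold for kept in options):
            continue
        options.append(text)
        if len(options) == max_options:
            break
    return options


if __name__ == "__main__":
    beams = [
        ("returns the index of the first match", 0.42),
        ("returns the index of the first matching element", 0.40),
        ("raises ValueError if no element matches", 0.11),
    ]
    # The second beam is filtered out as a near-duplicate of the first,
    # so the user sees two genuinely different completion options.
    print(select_dissimilar_options(beams))
```

The threshold, the number of options, and the similarity function here are illustrative parameters only; the key point from the abstract is that filtering near-duplicate beams raises the chance that one of the few displayed options needs little revision.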