Efficacy of Deep Neural Networks in Natural Language Processing for Classifying Requirements by Origin and Functionality: An Application of BERT in System Requirement

IF 2.9, CAS Tier 3 (Engineering & Technology), JCR Q2 (ENGINEERING, MECHANICAL)
Jesse Mullis, Cheng Chen, Scott Ferguson, Beshoy Morkos
{"title":"深度神经网络在自然语言处理中按来源和功能分类需求的有效性——BERT在系统需求中的应用","authors":"Jesse Mullis, Cheng Chen, Scott Ferguson, Beshoy Morkos","doi":"10.1115/1.4063764","DOIUrl":null,"url":null,"abstract":"Abstract Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can sufficiently capture the nuances that differentiate requirements between and within design documents. This work evaluates BERT’s performance on two requirement classification tasks (one inter- document and one intra-document) executed on a corpus of 1,303 requirements sourced from five system design projects. First, in the “parent document classification” task, a BERT model is fine-tuned to classify requirements according to their originating project. A separate BERT model is then fine-tuned on a “functional classification” task where each requirement is classified as either functional or nonfunctional. Our results also include a comparison with a baseline model, Word2Vec, and demonstrate that our model achieves higher classification accuracy. When evaluated on test sets, the former model receives a Matthews correlation coefficient (MCC) of 0.95, while the latter receives an MCC of 0.82, indicating BERT’s ability to reliably distinguish requirements. This work then explores the application of BERT’s representations, known as embeddings, to identify similar requirements and predict requirement change.","PeriodicalId":50137,"journal":{"name":"Journal of Mechanical Design","volume":"6 8","pages":"0"},"PeriodicalIF":2.9000,"publicationDate":"2023-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficacy of Deep Neural Networks in Natural Language Processing for Classifying Requirements by Origin and Functionality: An Application of BERT in System Requirement\",\"authors\":\"Jesse Mullis, Cheng Chen, Scott Ferguson, Beshoy Morkos\",\"doi\":\"10.1115/1.4063764\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can sufficiently capture the nuances that differentiate requirements between and within design documents. This work evaluates BERT’s performance on two requirement classification tasks (one inter- document and one intra-document) executed on a corpus of 1,303 requirements sourced from five system design projects. First, in the “parent document classification” task, a BERT model is fine-tuned to classify requirements according to their originating project. 
A separate BERT model is then fine-tuned on a “functional classification” task where each requirement is classified as either functional or nonfunctional. Our results also include a comparison with a baseline model, Word2Vec, and demonstrate that our model achieves higher classification accuracy. When evaluated on test sets, the former model receives a Matthews correlation coefficient (MCC) of 0.95, while the latter receives an MCC of 0.82, indicating BERT’s ability to reliably distinguish requirements. This work then explores the application of BERT’s representations, known as embeddings, to identify similar requirements and predict requirement change.\",\"PeriodicalId\":50137,\"journal\":{\"name\":\"Journal of Mechanical Design\",\"volume\":\"6 8\",\"pages\":\"0\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2023-11-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Mechanical Design\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1115/1.4063764\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, MECHANICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Mechanical Design","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/1.4063764","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MECHANICAL","Score":null,"Total":0}
Citations: 0

Abstract

Given the foundational role of system requirements in design projects, designers can benefit from classifying, comparing, and observing connections between requirements. Manually undertaking these processes, however, can be laborious and time-consuming. Previous studies have employed Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art natural language processing (NLP) deep neural network model, to automatically analyze written requirements. Yet, it remains unclear whether BERT can sufficiently capture the nuances that differentiate requirements between and within design documents. This work evaluates BERT’s performance on two requirement classification tasks (one inter-document and one intra-document) executed on a corpus of 1,303 requirements sourced from five system design projects. First, in the “parent document classification” task, a BERT model is fine-tuned to classify requirements according to their originating project. A separate BERT model is then fine-tuned on a “functional classification” task where each requirement is classified as either functional or nonfunctional. Our results also include a comparison with a baseline model, Word2Vec, and demonstrate that our model achieves higher classification accuracy. When evaluated on test sets, the former model receives a Matthews correlation coefficient (MCC) of 0.95, while the latter receives an MCC of 0.82, indicating BERT’s ability to reliably distinguish requirements. This work then explores the application of BERT’s representations, known as embeddings, to identify similar requirements and predict requirement change.
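
The abstract outlines the pipeline only at a high level. Purely as a hedged illustration of what such a pipeline can look like, the Python sketch below fine-tunes a pretrained bert-base-uncased checkpoint (via the Hugging Face transformers library and scikit-learn) on a toy functional/nonfunctional labeling task, scores it with MCC, and compares two requirement embeddings by cosine similarity. The example requirement strings, label encoding, learning rate, and [CLS]-token pooling are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch (not the authors' released code) of the workflow the abstract
# describes: fine-tune a pretrained BERT checkpoint to label requirements as
# functional vs. nonfunctional, report the Matthews correlation coefficient (MCC),
# and compare requirement embeddings with cosine similarity.
# The requirement strings, label encoding, and hyperparameters are placeholders.
import torch
import torch.nn.functional as F
from torch.optim import AdamW
from sklearn.metrics import matthews_corrcoef
from transformers import AutoTokenizer, AutoModel, AutoModelForSequenceClassification

texts = [
    "The system shall log every user transaction.",           # functional (assumed)
    "The interface shall respond within 200 milliseconds.",   # nonfunctional (assumed)
]
labels = torch.tensor([0, 1])  # 0 = functional, 1 = nonfunctional (assumed encoding)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
classifier = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Fine-tuning: a few gradient steps on the toy batch; a real run would iterate
# over mini-batches drawn from a labeled requirements corpus.
optimizer = AdamW(classifier.parameters(), lr=2e-5)
classifier.train()
for _ in range(3):
    loss = classifier(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Evaluation with MCC, the metric reported in the abstract.
classifier.eval()
with torch.no_grad():
    preds = classifier(**batch).logits.argmax(dim=-1)
print("MCC:", matthews_corrcoef(labels.numpy(), preds.numpy()))

# Requirement embeddings: the [CLS] token of the last hidden layer is used here
# as the sentence vector (one common pooling choice, not necessarily the paper's),
# and two requirements are compared with cosine similarity.
encoder = AutoModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    cls_embeddings = encoder(**batch).last_hidden_state[:, 0]  # shape (2, 768)
similarity = F.cosine_similarity(cls_embeddings[0], cls_embeddings[1], dim=0)
print("Cosine similarity between the two requirements:", similarity.item())
```

Swapping the toy lists for a labeled corpus such as the 1,303 requirements mentioned above, and adding a proper train/test split, would reproduce the general shape of the reported experiments; the paper's exact hyperparameters and pooling strategy are not specified here.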
Source journal
Journal of Mechanical Design (Engineering & Technology; Engineering: Mechanical)
CiteScore: 8.00
Self-citation rate: 18.20%
Articles published per year: 139
Review time: 3.9 months
Journal description: The Journal of Mechanical Design (JMD) serves the broad design community as the venue for scholarly, archival research in all aspects of the design activity with emphasis on design synthesis. JMD has traditionally served the ASME Design Engineering Division and its technical committees, but it welcomes contributions from all areas of design with emphasis on synthesis. JMD communicates original contributions, primarily in the form of research articles of considerable depth, but also technical briefs, design innovation papers, book reviews, and editorials.