Trend Analysis of Large Language Models through a Developer Community: A Focus on Stack Overflow

IF 2.4 Q3 COMPUTER SCIENCE, INFORMATION SYSTEMS
Jungha Son, Boyoung Kim
{"title":"基于开发者社区的大型语言模型趋势分析:对堆栈溢出的关注","authors":"Jungha Son, Boyoung Kim","doi":"10.3390/info14110602","DOIUrl":null,"url":null,"abstract":"In the rapidly advancing field of large language model (LLM) research, platforms like Stack Overflow offer invaluable insights into the developer community’s perceptions, challenges, and interactions. This research aims to analyze LLM research and development trends within the professional community. Through the rigorous analysis of Stack Overflow, employing a comprehensive dataset spanning several years, the study identifies the prevailing technologies and frameworks underlining the dominance of models and platforms such as Transformer and Hugging Face. Furthermore, a thematic exploration using Latent Dirichlet Allocation unravels a spectrum of LLM discussion topics. As a result of the analysis, twenty keywords were derived, and a total of five key dimensions, “OpenAI Ecosystem and Challenges”, “LLM Training with Frameworks”, “APIs, File Handling and App Development”, “Programming Constructs and LLM Integration”, and “Data Processing and LLM Functionalities”, were identified through intertopic distance mapping. This research underscores the notable prevalence of specific Tags and technologies within the LLM discourse, particularly highlighting the influential roles of Transformer models and frameworks like Hugging Face. This dominance not only reflects the preferences and inclinations of the developer community but also illuminates the primary tools and technologies they leverage in the continually evolving field of LLMs.","PeriodicalId":38479,"journal":{"name":"Information (Switzerland)","volume":null,"pages":null},"PeriodicalIF":2.4000,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Trend Analysis of Large Language Models through a Developer Community: A Focus on Stack Overflow\",\"authors\":\"Jungha Son, Boyoung Kim\",\"doi\":\"10.3390/info14110602\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the rapidly advancing field of large language model (LLM) research, platforms like Stack Overflow offer invaluable insights into the developer community’s perceptions, challenges, and interactions. This research aims to analyze LLM research and development trends within the professional community. Through the rigorous analysis of Stack Overflow, employing a comprehensive dataset spanning several years, the study identifies the prevailing technologies and frameworks underlining the dominance of models and platforms such as Transformer and Hugging Face. Furthermore, a thematic exploration using Latent Dirichlet Allocation unravels a spectrum of LLM discussion topics. As a result of the analysis, twenty keywords were derived, and a total of five key dimensions, “OpenAI Ecosystem and Challenges”, “LLM Training with Frameworks”, “APIs, File Handling and App Development”, “Programming Constructs and LLM Integration”, and “Data Processing and LLM Functionalities”, were identified through intertopic distance mapping. This research underscores the notable prevalence of specific Tags and technologies within the LLM discourse, particularly highlighting the influential roles of Transformer models and frameworks like Hugging Face. 
This dominance not only reflects the preferences and inclinations of the developer community but also illuminates the primary tools and technologies they leverage in the continually evolving field of LLMs.\",\"PeriodicalId\":38479,\"journal\":{\"name\":\"Information (Switzerland)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.4000,\"publicationDate\":\"2023-11-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Information (Switzerland)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/info14110602\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information (Switzerland)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/info14110602","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

In the rapidly advancing field of large language model (LLM) research, platforms like Stack Overflow offer invaluable insights into the developer community’s perceptions, challenges, and interactions. This research aims to analyze LLM research and development trends within the professional community. Through the rigorous analysis of Stack Overflow, employing a comprehensive dataset spanning several years, the study identifies the prevailing technologies and frameworks underlining the dominance of models and platforms such as Transformer and Hugging Face. Furthermore, a thematic exploration using Latent Dirichlet Allocation unravels a spectrum of LLM discussion topics. As a result of the analysis, twenty keywords were derived, and a total of five key dimensions, “OpenAI Ecosystem and Challenges”, “LLM Training with Frameworks”, “APIs, File Handling and App Development”, “Programming Constructs and LLM Integration”, and “Data Processing and LLM Functionalities”, were identified through intertopic distance mapping. This research underscores the notable prevalence of specific Tags and technologies within the LLM discourse, particularly highlighting the influential roles of Transformer models and frameworks like Hugging Face. This dominance not only reflects the preferences and inclinations of the developer community but also illuminates the primary tools and technologies they leverage in the continually evolving field of LLMs.
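The topic-modeling step described above can be reproduced in outline with standard tooling. The paper does not publish its pipeline, so the following is a minimal sketch, assuming a hypothetical CSV export llm_questions.csv of LLM-tagged Stack Overflow question titles in a title column; the scikit-learn implementation, the five-topic setting (matching the five dimensions reported), and the vectorizer thresholds are illustrative choices, not the authors' configuration.

```python
# Minimal LDA sketch over Stack Overflow question titles (illustrative only).
# "llm_questions.csv" and its "title" column are hypothetical placeholders.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Load question titles from a hypothetical export of LLM-tagged posts.
titles = pd.read_csv("llm_questions.csv")["title"].dropna()

# Bag-of-words matrix; drop very rare and very common terms.
vectorizer = CountVectorizer(stop_words="english", min_df=5, max_df=0.5)
doc_term = vectorizer.fit_transform(titles)

# Fit LDA with five topics, mirroring the five dimensions the paper reports.
lda = LatentDirichletAllocation(n_components=5, random_state=0)
lda.fit(doc_term)

# Show the highest-weight terms per topic (the paper derives twenty keywords).
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"Topic {topic_idx}: {', '.join(top)}")
```

An intertopic distance map of the kind used to separate the five dimensions is typically produced by feeding a fitted model such as this one into a visualization library like pyLDAvis.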
Source journal
Information (Switzerland) (Computer Science, Information Systems)
CiteScore
6.90
Self-citation rate
0.00%
Articles published
515
Review time
11 weeks