How would prompt completion editing impact user experience scores in academic research with large language models?

Impact Factor 12.5 · CAS Tier 1 (Sociology) · JCR Q1 (Social Issues)
Gerald Nderitu Njuguna, Min Qingfei
{"title":"在使用大型语言模型的学术研究中,提示完成编辑如何影响用户体验分数?","authors":"Gerald Nderitu Njuguna,&nbsp;Min Qingfei","doi":"10.1016/j.techsoc.2025.103080","DOIUrl":null,"url":null,"abstract":"<div><div>Prompt completion editing (PCE), the user-driven revision of large language model (LLM) completions, is a critical behaviour in the academic applications of multimodal LLMs. However, few studies have examined how these edits function as implicit reinforcement signals to improve LLM alignment and enhance user experience (UX). This study investigates how PCE, conceptualised through the dimensions of language stylistics, personalisation, and labelling, affects UX outcomes, including performance, task management, and user satisfaction. The sample consisted of 294 respondents from China and Kenya. Using a user-centred approach, this study applies partial least squares structural equation modelling for empirical analysis. The results show that PCE significantly improves UX (β = 0.304, t = 3.965, p &lt; 0.001) and acts as a proxy for implicit human feedback in LLM optimisation. Mediation analysis confirms that data management and prompting experience significantly explain the relationships (p &lt; 0.001), whereas simple slope analysis supports the moderation effects of perceived usefulness, task fit, and quality. The findings suggest that user edits serve as fine-grained feedback signals that enhance personalisation and usability in academic contexts. These results inform the design of more flexible and feedback-aware LLM systems, thereby advancing the development of human-in-the-loop artificial intelligence.</div></div>","PeriodicalId":47979,"journal":{"name":"Technology in Society","volume":"84 ","pages":"Article 103080"},"PeriodicalIF":12.5000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"How would prompt completion editing impact user experience scores in academic research with large language models?\",\"authors\":\"Gerald Nderitu Njuguna,&nbsp;Min Qingfei\",\"doi\":\"10.1016/j.techsoc.2025.103080\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Prompt completion editing (PCE), the user-driven revision of large language model (LLM) completions, is a critical behaviour in the academic applications of multimodal LLMs. However, few studies have examined how these edits function as implicit reinforcement signals to improve LLM alignment and enhance user experience (UX). This study investigates how PCE, conceptualised through the dimensions of language stylistics, personalisation, and labelling, affects UX outcomes, including performance, task management, and user satisfaction. The sample consisted of 294 respondents from China and Kenya. Using a user-centred approach, this study applies partial least squares structural equation modelling for empirical analysis. The results show that PCE significantly improves UX (β = 0.304, t = 3.965, p &lt; 0.001) and acts as a proxy for implicit human feedback in LLM optimisation. Mediation analysis confirms that data management and prompting experience significantly explain the relationships (p &lt; 0.001), whereas simple slope analysis supports the moderation effects of perceived usefulness, task fit, and quality. The findings suggest that user edits serve as fine-grained feedback signals that enhance personalisation and usability in academic contexts. 
These results inform the design of more flexible and feedback-aware LLM systems, thereby advancing the development of human-in-the-loop artificial intelligence.</div></div>\",\"PeriodicalId\":47979,\"journal\":{\"name\":\"Technology in Society\",\"volume\":\"84 \",\"pages\":\"Article 103080\"},\"PeriodicalIF\":12.5000,\"publicationDate\":\"2025-09-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Technology in Society\",\"FirstCategoryId\":\"90\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0160791X25002702\",\"RegionNum\":1,\"RegionCategory\":\"社会学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"SOCIAL ISSUES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Technology in Society","FirstCategoryId":"90","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0160791X25002702","RegionNum":1,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"SOCIAL ISSUES","Score":null,"Total":0}
Citations: 0

Abstract

Prompt completion editing (PCE), the user-driven revision of large language model (LLM) completions, is a critical behaviour in the academic applications of multimodal LLMs. However, few studies have examined how these edits function as implicit reinforcement signals to improve LLM alignment and enhance user experience (UX). This study investigates how PCE, conceptualised through the dimensions of language stylistics, personalisation, and labelling, affects UX outcomes, including performance, task management, and user satisfaction. The sample consisted of 294 respondents from China and Kenya. Using a user-centred approach, this study applies partial least squares structural equation modelling for empirical analysis. The results show that PCE significantly improves UX (β = 0.304, t = 3.965, p < 0.001) and acts as a proxy for implicit human feedback in LLM optimisation. Mediation analysis confirms that data management and prompting experience significantly explain the relationships (p < 0.001), whereas simple slope analysis supports the moderation effects of perceived usefulness, task fit, and quality. The findings suggest that user edits serve as fine-grained feedback signals that enhance personalisation and usability in academic contexts. These results inform the design of more flexible and feedback-aware LLM systems, thereby advancing the development of human-in-the-loop artificial intelligence.
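As a concrete illustration of the simple slope (moderation) analysis the abstract refers to, the sketch below runs a moderated regression on synthetic data and evaluates the slope of PCE on UX at low, mean, and high levels of a moderator. This is a minimal sketch under stated assumptions: the variable names (pce, usefulness, ux), the simulated effect sizes, and the use of ordinary least squares in place of the study's PLS-SEM are all hypothetical placeholders, not the paper's actual measures or results.

# Minimal simple-slope (moderation) sketch on synthetic data.
# All names and effects are illustrative; this is not the study's model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 294  # matches the study's sample size, but the data here is simulated

# Simulate standardised scores: PCE, a hypothesised moderator
# (perceived usefulness), and a UX outcome whose slope on PCE
# increases with the moderator (an interaction effect).
pce = rng.normal(size=n)
usefulness = rng.normal(size=n)
ux = (0.3 * pce + 0.2 * usefulness + 0.15 * pce * usefulness
      + rng.normal(scale=0.8, size=n))
df = pd.DataFrame({"pce": pce, "usefulness": usefulness, "ux": ux})

# Moderated regression: UX ~ PCE + moderator + PCE x moderator.
model = smf.ols("ux ~ pce * usefulness", data=df).fit()
print(model.summary().tables[1])

# Simple slopes: the effect of PCE on UX evaluated at -1 SD,
# the mean, and +1 SD of the moderator.
b_pce = model.params["pce"]
b_int = model.params["pce:usefulness"]
sd = df["usefulness"].std()
for label, level in [("-1 SD", -sd), ("mean", 0.0), ("+1 SD", sd)]:
    print(f"slope of PCE on UX at {label} usefulness: {b_pce + b_int * level:.3f}")

A significant interaction coefficient, together with slopes that strengthen as the moderator rises, is the pattern a simple slope analysis would report in support of a moderation effect such as perceived usefulness.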
Source journal: Technology in Society
CiteScore: 17.90
Self-citation rate: 14.10%
Articles published per year: 316
Review time: 60 days
About the journal: Technology in Society is a global journal dedicated to fostering discourse at the crossroads of technological change and the social, economic, business, and philosophical transformation of our world. The journal aims to provide scholarly contributions that empower decision-makers to thoughtfully and intentionally navigate the decisions shaping this dynamic landscape. A common thread across these fields is the role of technology in society, influencing economic, political, and cultural dynamics. Scholarly work in Technology in Society delves into the social forces shaping technological decisions and the societal choices regarding technology use. This encompasses scholarly and theoretical approaches (history and philosophy of science and technology, technology forecasting, economic growth, policy, and ethics), applied approaches (business innovation, technology management, legal, and engineering), and developmental perspectives (technology transfer, technology assessment, and economic development). Detailed information about the journal's aims and scope on specific topics can be found in Technology in Society Briefings, accessible via our Special Issues and Article Collections.