The Guided Evaluation Method: An easier way to empirically estimate trained user performance for unfamiliar keyboard layouts

IF 5.3 | CAS Quartile 2, Computer Science | JCR Q1, COMPUTER SCIENCE, CYBERNETICS
Aunnoy K Mutasim , Anil Ufuk Batmaz , Moaaz Hudhud Mughrabi , Wolfgang Stuerzlinger
DOI: 10.1016/j.ijhcs.2024.103317
Published: 2024-06-13 (Journal Article), International Journal of Human-Computer Studies
Full text: https://www.sciencedirect.com/science/article/pii/S1071581924001010
Citations: 0

Abstract


To determine in a user study whether proposed keyboard layouts, such as OPTI, can surpass QWERTY in performance, extended training through longitudinal studies is crucial. However, addressing the challenge of creating trained users presents a logistical bottleneck. A common alternative involves having participants type the same word or phrase repeatedly. We conducted two separate studies to investigate this alternative. The findings reveal that both approaches, repeatedly typing words or phrases, have limitations in accurately estimating trained user performance. Thus, we propose the Guided Evaluation Method (GEM), a novel approach to quickly estimate trained user performance with novices. Our results reveal that in a matter of minutes, participants exhibited performance similar to an existing longitudinal study — OPTI outperforms QWERTY. As it eliminates the need for resource-intensive longitudinal studies, our new GEM thus enables much faster estimation of trained user performance. This outcome will potentially reignite research on better text entry methods.
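The abstract compares layouts by "performance" without stating the metrics; the standard measures in text-entry studies of this kind are entry rate in words per minute (WPM) and a character-level error rate. As an illustrative sketch (these formulas are conventional in the field, not taken from this paper), both can be computed from a presented phrase, the transcribed text, and the entry time:

```python
def wpm(transcribed: str, seconds: float) -> float:
    """Entry rate, using the convention that a 'word' is 5 characters
    and the first character is excluded because timing starts on it."""
    return (len(transcribed) - 1) / seconds * 60.0 / 5.0

def cer(presented: str, transcribed: str) -> float:
    """Character error rate: minimum string (Levenshtein) distance
    between presented and transcribed text, normalized by the longer
    string's length."""
    m, n = len(presented), len(transcribed)
    # d[i][j] = edit distance between presented[:i] and transcribed[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n] / max(m, n)

# Example: 30 characters entered in 12 s -> (30 - 1) / 12 * 60 / 5 = 29.0 WPM
```

A longitudinal study would track these metrics per session until they plateau; the paper's contribution is a method that estimates that plateau from novices in minutes instead.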

Source journal: International Journal of Human-Computer Studies (Engineering & Technology — Computer Science: Cybernetics)
CiteScore: 11.50
Self-citation rate: 5.60%
Articles per year: 108
Review time: 3 months
Journal description: The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities. Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented Reality
• Intelligent user interfaces
• Presence
...