A background perspective on touch as a multimodal (and multisensor) construct

K. Hinckley
{"title":"A background perspective on touch as a multimodal (and multisensor) construct","authors":"K. Hinckley","doi":"10.1145/3015783.3015789","DOIUrl":null,"url":null,"abstract":"This chapter will illustrate, through a series of examples, seven different perspectives of how touch input can be re-framed and re-conceived as a multimodal, multisensor construct. \n \nThese perspectives often can particularly benefit from considering the background of interaction [Buxton 1995]---that is, interaction that takes place \"behind\" the foreground of the user's conscious attention, in response to sensed contextual information. For example, with touch-screen input, the user's intentional contact with the screen would comprise the foreground act, but the resulting vibrational forces imparted to the device can be sensed and leveraged \"in the background\" to infer additional contextual details of the touch. \n \nOver the years, I've found this background perspective extremely useful as a tool-for-thought to devise novel interactions, especially when multiple modalities and multiple sensors can be used simultaneously, in complementary and mutually reenforcing ways. This approach can be especially helpful to break out of whatever preconceptions one might have regarding an input modality, even something as 144 Chapter 4 A Background Perspective on Touch as a Multimodal (and Multisensor) Construct seemingly well studied and well understood as touch, which just for that reason will provide us with the bulk of the examples that I draw from in this chapter. \n \nThese perspectives of touch range from its traditional view as a modality that affords direct and intentional touchscreen input, to the inadvertent (yet still potentially valuable) phenomenon of unintentional (or \"Midas\") touch. We will consider various combinations of touch with other sensor signals such as tilt, inertial motion, grip sensing, and above-screen pre-touch sensing. We'll also discuss interesting ways to use pen and touch as complementary modalities for bimanual interaction. The focus of the chapter is design-centric, with contributions that focus on invention and innovation [Hudson and Mankoff 2014], rather than contributions, say, of formal experimental analysis or recognition methodologies. Likewise, the focus here is on the input side of touch, rather than the output side---as afforded by haptic and tactile feedback technologies (cf. Chapter 3). \n \nThe key point of this chapter is that low-level \"sensing\" channels such as grip and proximity and motion can be conceived of as new modalities that afford natural interaction with devices. And, likewise, it is possible to subtly shift our perspective of \"modalities\" to consider novel ways that they may yield insights of sensing, particularly if considered from the oft-neglected perspective that the background of interaction can support the user's task focus in the foreground of attention most effectively. 
\n \nTo encourage reflection on the material, the chapter concludes with eight openended Focus Questions that can also serve as starting-points for independent research projects, as well as a Glossary of key terms.","PeriodicalId":222911,"journal":{"name":"The Handbook of Multimodal-Multisensor Interfaces, Volume 1","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Handbook of Multimodal-Multisensor Interfaces, Volume 1","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3015783.3015789","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

This chapter will illustrate, through a series of examples, seven different perspectives of how touch input can be re-framed and re-conceived as a multimodal, multisensor construct.

These perspectives can particularly benefit from considering the background of interaction [Buxton 1995]---that is, interaction that takes place "behind" the foreground of the user's conscious attention, in response to sensed contextual information. For example, with touchscreen input, the user's intentional contact with the screen constitutes the foreground act, but the resulting vibrational forces imparted to the device can be sensed and leveraged "in the background" to infer additional contextual details of the touch.

Over the years, I've found this background perspective extremely useful as a tool-for-thought for devising novel interactions, especially when multiple modalities and multiple sensors can be used simultaneously, in complementary and mutually reinforcing ways. This approach can be especially helpful for breaking out of whatever preconceptions one might have regarding an input modality, even something as seemingly well studied and well understood as touch, which for just that reason will provide the bulk of the examples I draw from in this chapter.

These perspectives of touch range from its traditional view as a modality that affords direct and intentional touchscreen input, to the inadvertent (yet still potentially valuable) phenomenon of unintentional (or "Midas") touch. We will consider various combinations of touch with other sensor signals such as tilt, inertial motion, grip sensing, and above-screen pre-touch sensing. We'll also discuss interesting ways to use pen and touch as complementary modalities for bimanual interaction. The focus of the chapter is design-centric, with contributions that focus on invention and innovation [Hudson and Mankoff 2014], rather than contributions, say, of formal experimental analysis or recognition methodologies. Likewise, the focus here is on the input side of touch, rather than the output side---as afforded by haptic and tactile feedback technologies (cf. Chapter 3).

The key point of this chapter is that low-level "sensing" channels such as grip, proximity, and motion can be conceived of as new modalities that afford natural interaction with devices. Likewise, it is possible to subtly shift our perspective on "modalities" to consider novel ways that they may yield insights for sensing, particularly when considered from the oft-neglected perspective that the background of interaction can most effectively support the user's task focus in the foreground of attention.

To encourage reflection on the material, the chapter concludes with eight open-ended Focus Questions that can also serve as starting points for independent research projects, as well as a Glossary of key terms.
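As a concrete illustration of the foreground/background distinction described above, the sketch below pairs a foreground touch event with an accelerometer trace sensed in the background to infer how firmly the screen was struck. This is only a minimal, hypothetical sketch: the TouchEvent and AccelSample structures, the time window, and the threshold are illustrative assumptions, not APIs or methods from the chapter.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TouchEvent:
    timestamp: float  # seconds
    x: float
    y: float


@dataclass
class AccelSample:
    timestamp: float  # seconds
    magnitude: float  # |acceleration| with gravity removed (assumed units: g)


def classify_touch(touch: TouchEvent,
                   accel: List[AccelSample],
                   window: float = 0.05,
                   hard_tap_threshold: float = 0.8) -> str:
    """Label a touch as 'hard' or 'soft' from the vibration it imparted.

    Examines accelerometer samples within +/- `window` seconds of the
    touch contact and compares the peak magnitude to a hypothetical
    threshold. A real system would calibrate this per device and user.
    """
    nearby = [s.magnitude for s in accel
              if abs(s.timestamp - touch.timestamp) <= window]
    if not nearby:
        return "unknown"  # no background signal available
    return "hard" if max(nearby) >= hard_tap_threshold else "soft"


if __name__ == "__main__":
    touch = TouchEvent(timestamp=1.230, x=120.0, y=340.0)
    accel = [AccelSample(1.225, 0.10),
             AccelSample(1.232, 1.05),   # spike from the impact
             AccelSample(1.240, 0.20)]
    print(classify_touch(touch, accel))  # -> "hard"
```

The design point is that the user never performs an extra action: the intentional contact remains the foreground act, while the motion signal is consulted silently in the background to enrich the interpretation of that same touch.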