Latest Publications: CHI '13 Extended Abstracts on Human Factors in Computing Systems

Multi-modal location-aware system for paratrooper team coordination
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2468779
Danielle Cummings, Manoj Prasad, G. Lucchese, C. Aikens, T. Hammond
Abstract: Navigation and assembly are critical tasks for Soldiers in battlefield situations. Paratroopers, in particular, must be able to parachute into a battlefield and locate and assemble their equipment as quickly and quietly as possible. Current assembly methods rely on bulky, antiquated equipment that inhibits the speed and effectiveness of such operations. To address this, we have created a multi-modal mobile navigation system that uses ruggedized beacons to mark assembly points and smartphones to assist in navigating to these points while minimizing cognitive load and maximizing situational awareness. To achieve this, we implemented a novel beacon-receiver protocol that allows an unlimited number of receivers to listen to the encrypted beaconing message using only ad-hoc Wi-Fi technologies. The system was evaluated by U.S. Army Paratroopers and proved quick to learn and efficient at moving Soldiers to navigation waypoints. Beyond military operations, this system could be applied to any task that requires the assembly and coordination of many individuals or teams, such as emergency evacuations, fighting wildfires, or locating airdropped humanitarian aid.
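The abstract does not disclose the protocol's details. As an illustrative sketch only (not the authors' design), the one-to-many property it describes — any number of receivers validating a broadcast assembly-point message with a pre-shared key — can be modeled with an HMAC-tagged payload; all names and the frame layout here are assumptions:

```python
import hmac
import hashlib
import struct

# Hypothetical shared secret distributed to the team before the jump.
KEY = b"pre-mission shared secret"

def make_beacon(beacon_id: int, seq: int, lat: float, lon: float,
                key: bytes = KEY) -> bytes:
    """Pack an assembly-point announcement and append a SHA-256 HMAC tag.

    Any number of receivers holding the key can verify the frame,
    mirroring the one-to-many nature of Wi-Fi beaconing.
    """
    payload = struct.pack("!IIdd", beacon_id, seq, lat, lon)
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def read_beacon(frame: bytes, key: bytes = KEY):
    """Verify the tag and unpack the payload; return None if tampered."""
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None
    beacon_id, seq, lat, lon = struct.unpack("!IIdd", payload)
    return {"id": beacon_id, "seq": seq, "lat": lat, "lon": lon}
```

In a real system the frame would ride in an ad-hoc Wi-Fi broadcast (e.g. a UDP datagram to the broadcast address), and the payload would be encrypted as well as authenticated, as the abstract indicates.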
Citations: 2
UnoJoy!: a library for rapid video game prototyping using arduino
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2479512
Alan D. Chatham, W. Walmink, F. Mueller
Abstract: UnoJoy! is a free, open-source library for the Arduino Uno platform that allows users to rapidly prototype system-native video game controllers. Using standard Arduino code, users assign inputs to button presses, and can then run a program that overwrites the Arduino firmware, allowing the Arduino to register as a native game controller on Windows, OS X, and PlayStation 3. Focusing on ease of use, the library allows researchers and interaction designers to quickly experiment with novel interaction methods while using high-quality commercial video games. In our practice, we have used it to add exertion-based controls to existing games and to explore how different controllers can affect the social experience of video games. We hope this tool can help other researchers and designers deepen our understanding of game interaction mechanics by making controller design simple.
Citations: 6
Posture training with real-time visual feedback
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2479629
Brett Taylor, M. Birk, R. Mandryk, Z. Ivkovic
Abstract: Our posture affects us in a number of surprising ways, influencing how we handle stress and how confident we feel. But it is difficult for people to maintain good posture. We present a non-invasive posture training system using an Xbox Kinect sensor that provides real-time visual feedback at two levels of fidelity.
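The abstract does not specify how posture is scored. As a hedged illustration of the general idea — not the paper's algorithm — a Kinect-style skeleton yields 3D joint positions, and forward head tilt can be estimated from the head and shoulder-center joints, then thresholded for coarse feedback (threshold value and joint names are assumptions):

```python
import math

def neck_tilt_deg(head, shoulder_center):
    """Angle between the shoulder->head vector and vertical, in degrees.

    Joints are (x, y, z) tuples in a Kinect-like coordinate frame
    (y up, z toward the sensor); 0 degrees is perfectly upright.
    """
    dx = head[0] - shoulder_center[0]
    dy = head[1] - shoulder_center[1]
    dz = head[2] - shoulder_center[2]
    horiz = math.hypot(dx, dz)          # horizontal displacement of the head
    return math.degrees(math.atan2(horiz, dy))

def posture_feedback(angle_deg, slouch_threshold=15.0):
    """Coarse two-level feedback: 'good' vs 'slouching'."""
    return "good" if angle_deg <= slouch_threshold else "slouching"
```

A higher-fidelity feedback level could render the continuous angle itself (e.g. as a tilting avatar) rather than the binary label.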
Citations: 12
Dynamic duo: phone-tablet interaction on tabletops
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2479520
T. Piazza, Shengdong Zhao, Gonzalo A. Ramos, A. Yantaç, M. Fjeld
Abstract: As an increasing number of users carry smartphones and tablets simultaneously, there is an opportunity to leverage these two form factors in a more complementary way. Our work explores this by (a) defining the design space of distributed input and output solutions that rely on and benefit from phone-tablet combinations working together physically and digitally, and (b) revealing the idiosyncrasies of each particular device combination via interactive prototypes. Our research provides actionable insight in this emerging area by defining a design space, suggesting a mobile framework, and implementing prototypical applications in areas such as distributed information display, distributed control, and combinations of these. For each of these, we show a few example techniques and demonstrate an application combining several techniques.
Citations: 7
VISO: a shared, formal knowledge base as a foundation for semi-automatic infovis systems
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2468677
Jan Polowinski, M. Voigt
Abstract: Interactive visual analytics systems can help solve the problem of identifying relevant information in the growing amount of data. To guide the user through visualization tasks, these semi-automatic systems need to store and use knowledge of this interdisciplinary domain. Unfortunately, visualization knowledge stored in one system cannot easily be reused in another due to a lack of shared formal models. To approach this problem, we introduce a visualization ontology (VISO) that formally models visualization-specific concepts and facts. Furthermore, we give first examples of the ontology's use within two systems and highlight how the community can get involved in extending and improving it.
Citations: 19
TouchShield: a virtual control for stable grip of a smartphone using the thumb
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2468589
Jonggi Hong, Geehyuk Lee
Abstract: People commonly manipulate their smartphones using the thumb, but this is often done with an unstable grip in which the phone rests on the fingers while the thumb hovers over the touch screen. To offer a secure and stable grip, we designed a virtual control called TouchShield, which provides a place where the thumb can pin the phone down. In a user study, we confirmed that this form of control does not interfere with existing touch-screen operations and that TouchShield can enable a more stable grip. An incidental function of TouchShield is that it provides shortcuts to frequently used commands via the thumb, a function that was also shown to be effective in the user study.
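The core interaction — a screen region the thumb can press without triggering the app underneath — comes down to routing touch events. The following minimal sketch is an illustrative assumption about how such routing might look, not the paper's implementation; all names and geometry are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TouchShield:
    """A rectangular rest area for the thumb near the screen edge.

    Touches inside the shield are absorbed (the thumb just pins the
    phone down), so they are not forwarded to the app underneath.
    """
    x: float
    y: float
    width: float
    height: float

    def absorbs(self, tx: float, ty: float) -> bool:
        # Simple axis-aligned hit test against the shield rectangle.
        return (self.x <= tx <= self.x + self.width and
                self.y <= ty <= self.y + self.height)

def route_touch(shield: TouchShield, tx: float, ty: float) -> str:
    """Return which layer consumes the touch: the shield or the app."""
    return "shield" if shield.absorbs(tx, ty) else "app"
```

The shortcut function mentioned in the abstract could then be layered on top, e.g. by treating distinct gestures inside the shield region as commands rather than absorbed rests.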
Citations: 9
The future of personal video communication: moving beyond talking heads to shared experiences
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2479658
Erick Oduor, Carman Neustaedter, Gina Venolia, Tejinder K. Judge
Abstract: Personal video communication systems such as Skype or FaceTime are becoming a common tool used by family and friends to communicate and interact over distance. Yet many are designed to support only conversation, with a focus on displaying 'talking heads'. In this workshop, we want to discuss the opportunities and challenges in moving beyond this design paradigm to one where personal video communication systems can be used to share everyday experiences. By this we are referring to systems that might support shared dinners, shared television watching, or even remote participation in events such as weddings, parties, or graduations. The list could go on, as the future of personal video communication is ripe for exploration and discussion.
Citations: 14
Smarter objects: using AR technology to program physical objects and their interactions
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2468528
Valentin Heun, Shunichi Kasahara, P. Maes
Abstract: The Smarter Objects system explores a new method for interaction with everyday objects. The system associates a virtual object with every physical object to support an easy means of modifying the interface and behavior of that physical object, as well as its interactions with other "smarter objects". As a user points a smartphone or tablet at a physical object, an augmented reality (AR) application recognizes the object and offers an intuitive graphical interface for programming the object's behavior and its interactions with other objects. Once reprogrammed, the Smarter Object can then be operated with a simple tangible interface (such as knobs or buttons). As such, Smarter Objects combine the adaptability of digital objects with the simple tangible interface of a physical object. We have implemented several Smarter Objects and usage scenarios demonstrating the potential of this approach.
Citations: 85
Enhancing visuospatial attention performance with brain-computer interfaces
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2468579
R. Trachel, T. Brochier, Maureen Clerc
Abstract: Visuospatial attention is often investigated through features related to the head or the gaze during human-computer interaction (HCI). However, the focus of attention can be dissociated from overt responses such as eye movements, and can be impossible to detect from behavioral data. Electroencephalography (EEG) can provide valuable information about covert aspects of spatial attention. We therefore propose an innovative approach toward developing a brain-computer interface (BCI) to enhance human reaction speed and accuracy. This poster presents an offline evaluation of the approach based on physiological data recorded in a visuospatial attention experiment. Finally, we discuss a future interface that could enhance HCI by displaying visual information at the focus of attention.
Citations: 8
HCI with sports
CHI '13 Extended Abstracts on Human Factors in Computing Systems Pub Date : 2013-04-27 DOI: 10.1145/2468356.2468817
F. Mueller, R. A. Khot, Alan D. Chatham, S. Pijnappel, Cagdas Toprak, Joe Marshall
Abstract: Recent advances in cheap sensor technology have made technology support for sports and physical exercise increasingly commonplace, as is evident from the growing popularity of heart rate monitors and GPS sports watches. This rise of technology to support sports activities raises many interaction issues, such as how to interact with these devices while moving and physically exerting oneself. This special interest group brings together industry practitioners and researchers interested in designing and understanding human-computer interaction where the human is physically active, engaging in exertion activities. Fitting with the theme, this special interest group will be "run" while running: participants will be invited to jog together, during which we will discuss technology interaction specific to being physically active, whilst being physically active ourselves.
Citations: 19