Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology: Latest Articles

Detecting student frustration based on handwriting behavior
H. Asai, H. Yamana
DOI: https://doi.org/10.1145/2508468.2514718
Abstract: Detecting states of frustration among students engaged in learning activities is critical to the success of teaching assistance tools. We examine the relationship between a student's pen activity and his/her state of frustration while solving handwritten problems. Based on a user study involving mathematics problems, we found that our detection method was able to detect student frustration with a precision of 87% and a recall of 90%. We also identified several particularly discriminative features, including writing stroke number, erased stroke number, pen activity time, and air stroke speed.
Published: 2013-10-08. Citations: 13
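The abstract reports detection quality as precision 87% and recall 90%. As a quick reference, these metrics are computed from true-positive, false-positive, and false-negative counts; the counts below are hypothetical, chosen only to reproduce the reported figures:

```python
def precision_recall(tp, fp, fn):
    """Compute precision and recall from detection outcome counts."""
    precision = tp / (tp + fp)  # fraction of flagged cases that were real
    recall = tp / (tp + fn)     # fraction of real cases that were flagged
    return precision, recall

# Hypothetical counts that yield roughly the reported 87% / 90%:
p, r = precision_recall(tp=90, fp=13, fn=10)
print(f"precision={p:.2f}, recall={r:.2f}")
```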
Flexkit: a rapid prototyping platform for flexible displays
David Holman, Jesse Burstyn, R. Brotman, A. Younkin, Roel Vertegaal
DOI: https://doi.org/10.1145/2508468.2514934
Abstract: Commercially available development platforms for flexible displays are not designed for rapid prototyping. To create a deformable interface, one that uses a functional flexible display, designers must be familiar with embedded hardware systems and corresponding programming. We introduce Flexkit, a platform that allows designers to rapidly prototype deformable applications. With Flexkit, designers can rapidly prototype using a thin-film electrophoretic display, one that is "Plug and Play". To demonstrate Flexkit's ease-of-use, we present its application in PaperTab's design iteration as a case study. We further discuss how dithering can be used to increase the frame rate of electrophoretic displays from 1fps to 5fps.
Published: 2013-10-08. Citations: 16
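The abstract mentions using dithering to raise the effective frame rate of an electrophoretic display but does not detail the scheme. One standard technique that fits this setting is ordered (Bayer) dithering, which binarises a grayscale frame so a bistable black/white display can approximate tones with fast binary updates. A minimal sketch, assuming frames arrive as NumPy arrays with values in [0, 1]:

```python
import numpy as np

# Classic 4x4 Bayer threshold matrix, normalised to [0, 1).
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def ordered_dither(gray):
    """Binarise a grayscale frame (values in [0, 1]) with ordered dithering."""
    h, w = gray.shape
    # Tile the threshold matrix over the frame and compare per pixel.
    tiles = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (gray > tiles).astype(np.uint8)
```

On a uniform mid-gray frame, this turns exactly half the pixels on, spatially distributed so the eye averages them back to gray.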
QOOK: a new physical-virtual coupling experience for active reading
Yuhang Zhao, Yongqiang Qin, Yang Liu, Siqi Liu, Yuanchun Shi
DOI: https://doi.org/10.1145/2508468.2514928
Abstract: We present QOOK, an interactive reading system that incorporates the benefits of both physical and digital books to facilitate active reading. QOOK uses a top-projector to create digital contents on a blank paper book. By detecting markers attached to each page, QOOK allows users to flip pages just like they would with a real book. Electronic functions such as keyword searching, highlighting and bookmarking are included to provide users with additional digital assistance. With a Kinect sensor that recognizes touch gestures, QOOK enables people to use these electronic functions directly with their fingers. The combination of the electronic functions of the virtual interface and free-form interaction with the physical book creates a natural reading experience, providing an opportunity for faster navigation between pages and better understanding of the book contents.
Published: 2013-10-08. Citations: 8
Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology
S. Izadi, A. Quigley, I. Poupyrev, T. Igarashi
DOI: https://doi.org/10.1145/2508468
Abstract: It is our pleasure to welcome you to the 26th Annual ACM Symposium on User Interface Software and Technology (UIST) 2013, held from October 8-11th, in the historic town and University of St Andrews, Scotland, United Kingdom.

UIST is the premier forum for the presentation of research innovations in the software and technology of human-computer interfaces. Sponsored by ACM's special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together researchers and practitioners from many areas, including web and graphical interfaces, new input and output devices, information visualization, sensing technologies, interactive displays, tabletop and tangible computing, interaction techniques, augmented and virtual reality, ubiquitous computing, and computer supported cooperative work. The single-track program and intimate size make UIST 2013 an ideal place to exchange results at the cutting edge of user interfaces research, to meet friends and colleagues, and to forge future collaborations.

We received a record 317 paper submissions from more than 30 countries. After a thorough review process, the program committee accepted 62 papers (19.5%). Each anonymous submission was first reviewed by three external reviewers, and meta-reviews were provided by two program committee members. If any of the five reviewers deemed a submission to pass a rejection threshold, we asked the authors to submit a short rebuttal addressing the reviewers' concerns. The program committee met in person in Pittsburgh, PA, on May 30-31, 2013, to select the papers for the conference. Submissions were finally accepted only after the authors provided a final revision addressing the committee's comments.

In addition to the presentations of accepted papers, this year's program includes a keynote by Raffaello D'Andrea (ETH Zurich) on feedback control systems for autonomous machines. A great line-up of posters, demos, the ninth annual Doctoral Symposium, and the fifth annual Student Innovation Contest (this year focusing on programmable water pumps called Pumpspark) completes the program. We hope you enjoy all aspects of the UIST 2013 program and our wonderful venues, and that your discussions and interactions prove fruitful.
Published: 2013-10-08. Citations: 3
Identifying emergent behaviours from longitudinal web use
Aitor Apaolaza
DOI: https://doi.org/10.1145/2508468.2508475
Abstract: Laboratory studies present difficulties in the understanding of how usage evolves over time. Employed observations are obtrusive and not naturalistic. Our system employs a remote capture tool that provides longitudinal low-level interaction data. It is easily deployable into any Web site, allowing deployments in-the-wild, and is completely unobtrusive. Web application interfaces are designed assuming users' goals. Requirement specifications contain well-defined use cases and scenarios that drive design and subsequent optimisations. Users' interaction patterns outside the expected ones are not considered. This results in an optimisation for a stylised user rather than a real one. A bottom-up analysis from low-level interaction data makes possible the emergence of users' tasks. Similarities among users can be found, and solutions that are effective for real users can be designed. Factors such as learnability and how interface changes affect users are difficult to observe in laboratory studies. Our solution makes it possible, adding a longitudinal point of view to traditional laboratory studies. The capture tool is deployed in real-world Web applications, capturing in-situ data from users. These data serve to explore analysis and visualisation possibilities. We present an example of the exploration results with one Web application.
Published: 2013-10-08. Citations: 4
BackTap: robust four-point tapping on the back of an off-the-shelf smartphone
Cheng Zhang, Aman Parnami, Caleb Southern, Edison Thomaz, Gabriel Reyes, R. Arriaga, G. Abowd
DOI: https://doi.org/10.1145/2508468.2514735
Abstract: We present BackTap, an interaction technique that extends the input modality of a smartphone to add four distinct tap locations on the back case of a smartphone. The BackTap interaction can be used eyes-free with the phone in a user's pocket, purse, or armband while walking, or while holding the phone with two hands so as not to occlude the screen with the fingers. We employ three common built-in sensors on the smartphone (microphone, gyroscope, and accelerometer) and feature a lightweight heuristic implementation. In an evaluation with eleven participants and three usage conditions, users were able to tap four distinct points with 92% to 96% accuracy.
Published: 2013-10-08. Citations: 17
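The abstract describes a lightweight heuristic over microphone, gyroscope, and accelerometer data but does not give the method itself. As a purely illustrative sketch (not the paper's algorithm, and with entirely hypothetical sign conventions): a tap away from the phone's centre induces a brief rotation impulse whose signed pitch and roll components could indicate which back quadrant was struck:

```python
def classify_tap(gyro_pitch, gyro_roll):
    """Guess which back-of-phone quadrant was tapped from the signed
    rotation impulse the tap induces.

    Sign conventions are hypothetical: positive pitch is assumed to mean
    the top edge rocked, positive roll the left edge.
    """
    row = 'top' if gyro_pitch > 0 else 'bottom'
    col = 'left' if gyro_roll > 0 else 'right'
    return f"{row}-{col}"
```

A real implementation would first detect the tap event itself (e.g. an acoustic or accelerometer spike) before reading off the rotation signs.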
Enabling an ecosystem of personal behavioral data
Jason Wiese
DOI: https://doi.org/10.1145/2508468.2508472
Abstract: Almost every computational system a person interacts with keeps a detailed log of that person's behavior. The possibility of this data promises a breadth of new service opportunities for improving people's lives through deep personalization, tools to manage aspects of their personal wellbeing, and services that support identity construction. However, the way that this data is collected and managed today introduces several challenges that severely limit the utility of this rich data. This thesis maps out a computational ecosystem for personal behavioral data through the design, implementation, and evaluation of Phenom, a web service that factors out common activities in making inferences from personal behavioral data. The primary benefits of Phenom include: a structured process for aggregating and representing user data; support for developing models based on personal behavioral data; and a unified API for accessing inferences made by models within Phenom. To evaluate Phenom for ease of use and versatility, an external set of developers will create example applications with it.
Published: 2013-10-08. Citations: 3
A touchless passive infrared gesture sensor
Piotr Wojtczuk, T. David Binnie, A. Armitage, T. Chamberlain, C. Giebeler
DOI: https://doi.org/10.1145/2508468.2514713
Abstract: A sensing device for a touchless, hand gesture, user interface based on an inexpensive passive infrared pyroelectric detector array is presented. The 2 x 2 element sensor responds to changing infrared radiation generated by hand movement over the array. The sensing range is from a few millimetres to tens of centimetres. The low power consumption (< 50 μW) enables the sensor's use in mobile devices and in low energy applications. Detection rates of 77% have been demonstrated using a prototype system that differentiates the four main hand motion trajectories -- up, down, left and right. This device allows greater non-contact control capability without an increase in size, cost or power consumption over existing on/off devices.
Published: 2013-10-08. Citations: 11
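The abstract states that the 2 x 2 pyroelectric array differentiates four swipe trajectories (up, down, left, right) but not how. A natural approach, sketched here under the assumption that each element reports the time its signal first crosses a threshold, is to compare when each half of the array fired and take the dominant axis:

```python
def classify_swipe(t):
    """Classify a hand-swipe direction from activation times of a 2x2 array.

    t: dict mapping (row, col) -> first-activation time in seconds,
       with row 0 = top and col 0 = left (layout is an assumption).
    Returns 'up', 'down', 'left', or 'right'.
    """
    top    = min(t[(0, 0)], t[(0, 1)])
    bottom = min(t[(1, 0)], t[(1, 1)])
    left   = min(t[(0, 0)], t[(1, 0)])
    right  = min(t[(0, 1)], t[(1, 1)])
    dv = bottom - top   # > 0: top half fired first -> hand moving down
    dh = right - left   # > 0: left half fired first -> hand moving right
    if abs(dv) >= abs(dh):
        return 'down' if dv > 0 else 'up'
    return 'right' if dh > 0 else 'left'
```

For example, a left-to-right swipe activates the two left elements slightly before the two right ones, so the horizontal delay dominates and the function returns 'right'.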
A cluster information navigate method by gaze tracking
Dawei Cheng, Danqiong Li, Liang Fang
DOI: https://doi.org/10.1145/2508468.2514710
Abstract: With the rapid growth of data volume, it is increasingly complicated to present and navigate large amounts of data conveniently on mobile devices with a small screen. To address this challenge, we present a new method that displays cluster information in a hierarchical pattern and lets users interact with it through eye movements captured by the front camera of mobile devices. The key of this system is providing users a new interaction method to navigate and select data quickly with their eyes, without any additional equipment.
Published: 2013-10-08. Citations: 1
Sensor design and interaction techniques for gestural input to smart glasses and mobile devices
Andrea Colaco
DOI: https://doi.org/10.1145/2508468.2508474
Abstract: Touchscreen interfaces for small display devices have several limitations: the act of touching the screen occludes the display, interface elements like keyboards consume precious display real estate, and even simple tasks like document navigation -- which the user performs effortlessly using a mouse and keyboard -- require repeated actions like pinch-and-zoom with touch input. More recently, smart glasses with limited or no touch input are starting to emerge commercially. However, the primary input to these systems has been voice. In this paper, we explore the space around the device as a means of touchless gestural input to devices with small or no displays. Capturing gestural input in the surrounding volume requires sensing the human hand. To achieve gestural input we have built Mime [3] -- a compact, low-power 3D sensor for short-range gestural control of small display devices. Our sensor is based on a novel signal processing pipeline and is built using standard off-the-shelf components. Using Mime we demonstrated a variety of application scenarios including 3D spatial input using close-range gestures, gaming, on-the-move interaction, and operation in cluttered environments and in broad daylight conditions. In my thesis, I will continue to extend sensor capabilities to support new interaction styles.
Published: 2013-10-08. Citations: 8