SIGGRAPH Asia 2014 Emerging Technologies — Latest Publications

MR coral sea: mixed reality aquarium with physical MR displays
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669051
Toshikazu Ohshima, Chiharu Tanaka
Abstract: MR Coral Sea is a mixed-reality (MR) aquarium in which a user can play with virtual fish via a Coral Display, an MR display device with physical feedback. The virtual fish decide their behavior in response to the user's hand movements, and the device provides physical feedback to the user through illumination, tactile sensation, and sound.
Citations: 2
Touch at a distance: simple perception aid device with user's explorer action
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669058
J. Akita, T. Ono, Kiyohide Ito, M. Okamoto
Abstract: Although we obtain much of the information about our environment through vision, we also obtain rich information through non-visual modalities. When perceiving our environment, we use not only sensor information but also how that information changes according to how we act. For example, we obtain haptic information through the skin of a finger, and as we move the finger along the surface of an object, the haptic information changes with the finger's motion; by executing this action-and-sensing process, we perceive the whole shape of the object. In other words, we have a strong ability to integrate our body's actions with the resulting sensing data, improving the effective accuracy of our bodily senses. Based on this idea, we developed a simple perception aid device, named "FutureBody-Finger," that lets the user perceive objects at a distance through exploratory action by linking a range sensor to a haptic actuator. The distance sensor measures the distance to an object (20--80 cm), which is converted to the angle of a lever attached to a servo motor (0--60 deg). The user holds the device with an index finger resting on the lever. When the object is far away, the lever leans forward and the user feels nothing; when the object is close, the lever stands vertically and the user feels the object's presence. Although the device measures the distance to only a single point on the object, as the user explores the surroundings, richer distance information about nearby objects accumulates, so that the user can finally perceive the shape of the whole object.
Citations: 2
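The distance-to-angle conversion described in the FutureBody-Finger abstract (20--80 cm mapped onto a 0--60 deg lever angle) can be sketched in a few lines. The abstract does not state whether the mapping is linear, so the linear transfer function and the function name below are assumptions for illustration only.

```python
def distance_to_lever_angle(distance_cm: float) -> float:
    """Map a range-sensor reading (20-80 cm) to a servo lever angle (0-60 deg).

    Per the abstract: a distant object leaves the lever leaning forward
    (here 0 deg, nothing felt), while a near object raises it toward
    vertical (here 60 deg, object 'felt' under the fingertip).
    """
    # Clamp the reading to the sensor's working range given in the abstract.
    d = max(20.0, min(80.0, distance_cm))
    # Assumed linear mapping: 80 cm -> 0 deg, 20 cm -> 60 deg.
    return (80.0 - d) / (80.0 - 20.0) * 60.0
```

A reading of 50 cm, halfway through the sensor range, would then raise the lever to 30 deg.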
Haptylus: haptic stylus for interaction with virtual objects behind a touch screen
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669054
Shingo Nagasaka, Yuuki Uranishi, Shunsuke Yoshimoto, M. Imura, O. Oshiro
Abstract: Tablet PCs and smartphones have rapidly become popular. People can touch objects on a tablet's or smartphone's touch-panel display, but they feel only the sensation of touching the display surface. Recently, systems have been proposed that appear to insert a retractable stylus into the display. Beyond [Lee and Ishii 2010] is one such system. It consists of a retractable stylus, a table-top display, an infrared marker, and a camera placed in the environment. A virtual tip of the stylus is rendered when the retractable stylus is pressed against the table-top display. The user's head position is detected with the infrared marker and camera, and the virtual objects and stylus tip are rendered correctly for that viewpoint, enabling the user to interact with a virtual object under the table. However, the stylus does not shrink or extend automatically, because it has no actuators such as a motor, so the user cannot feel haptic sensation from the virtual object. To interact with a virtual object more realistically, the user must be able to perceive force from it. Another limitation is that the system is stationary. ImpAct [Withana et al. 2010] is another interaction system, built from a smartphone and a retractable stylus. Its force feedback is produced simply by stopping the shrinkage of the stylus, so the system gives only rigid force feedback without tactile sensation from the virtual objects. In addition, the system does not consider the user's viewpoint.
Citations: 7
Daily support robots that move on me
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669055
Tamami Saga, N. Munekata, T. Ono
Abstract: Today, wearable devices that can support us constantly by being worn every day are gathering attention. In contrast, although personal robots are increasingly found in daily life, "wearable robots" are not yet prevalent. We developed a wearable robot, a partner that moves autonomously on the human body. As daily support, the robot has an application for correcting the wearer's sitting posture: it estimates the wearer's body state from sensors and, if it detects bad posture or a bad habit, points it out by moving directly to the problem region. The robot may be useful not only for correcting posture and bad habits, but especially for training children.
Citations: 3
Enforced telexistence: teleoperating using photorealistic virtual body and haptic feedback
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669048
M. Y. Saraiji, C. Fernando, Yusuke Mizushina, Yoichi Kamiyama, K. Minamizawa, S. Tachi
Abstract: Telexistence [Tachi 2010] systems require physical limbs for remote object manipulation [Fernando et al. 2012]. Arms and hands synchronized with voluntary movements allow the user to feel the robot's body as their own through visual and haptic sensation. Here we introduce a novel technique that provides virtual arms for existing telexistence systems that lack physical arms. Previous works [Mine et al. 1997; Poupyrev et al. 1998; Nedel et al. 2003] studied virtual representations of the user's hands for interaction in virtual environments. In this work, the virtual arms serve several interactions in a physical remote environment and, most importantly, give the user a sense of existence in that remote environment. The superimposed virtual arms follow the user's arm movements in real time and react to the dynamic lighting of the real environment, providing photorealistic rendering that adapts to the remote lighting. The user thus experiences embodied enforcement toward the remote environment. Furthermore, the virtual arms can be extended to touch and feel unreachable remote objects, and to grab a functional virtual copy of a physical object through which device control is possible. This method not only lets the user experience a non-existing arm in telexistence, but also gives the ability to enforce the remote environment in various ways.
Citations: 9
One-man orchestra: conducting smartphone orchestra
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669049
Chun Kit Tsui, Chi Hei Law, Hongbo Fu
Abstract: This work presents a new platform for performing a one-man orchestra (Figure 1). The conductor is the only human involved and uses traditional bimanual conducting gestures to interactively direct the performance of smartphones, rather than the human performers of a real-world orchestra. Each smartphone acts as a virtual performer playing a particular musical instrument, such as piano or violin. Our work not only allows ordinary people to experience music conducting, but also provides a training platform on which students can practice conducting with a unique listening experience.
Citations: 3
A-blocks: recognizing and assessing child building processes during play with toy blocks
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669061
Toshiki Hosoi, Kazuki Takashima, T. Adachi, Yuichi Itoh, Y. Kitamura
Abstract: We propose A-Blocks, a novel building-block device that enables detection and recognition of children's actions and interactions when building with blocks. Quantitative data gathered while constructing and breaking A-Blocks can be valuable for various assessment applications (e.g., play therapy, cognitive testing, and education). In our prototype system, each block embeds a wireless measurement device that includes acceleration, angular-velocity, and geomagnetic sensors to measure the block's spatial motion and posture during children's play. A standard set of blocks can be managed via Bluetooth in real time. By combining the sensor data, the system can estimate how blocks are stacked on one another by detecting surface collisions (Figure 1), and it can recognize many fundamental play-action patterns (e.g., moving, stacking, standing, waving) with an SVM. Unlike existing block-shaped devices with physical constraints on their connections (e.g., electrical hooks, magnets), our solid, traditionally shaped block device supports flexible block play that can include the more delicate motions reflecting a child's inner state (e.g., learning stage, stress level, expression of imagination). These benefits of analyzing children's block play can be extended to allow more enjoyable and interactive play, with social impacts that include more constructive play.
Citations: 16
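The A-Blocks abstract names SVM-based recognition of play actions from each block's motion sensors but does not give the feature set. The sketch below shows one hypothetical way a feature vector (per-axis mean and standard deviation over a window of 3-axis accelerometer samples) might be computed before being fed to such a classifier; the function name and feature choice are illustrative assumptions, not the paper's method.

```python
import statistics

def motion_features(accel_window):
    """Summarize a window of (x, y, z) accelerometer samples into a
    flat feature vector: [mean_x, std_x, mean_y, std_y, mean_z, std_z].

    Windows of features like these are a common input representation
    for classifiers (e.g., an SVM) distinguishing actions such as
    moving, stacking, standing, or waving a block.
    """
    xs, ys, zs = zip(*accel_window)
    feats = []
    for axis in (xs, ys, zs):
        feats.append(statistics.fmean(axis))    # average acceleration on this axis
        feats.append(statistics.pstdev(axis))   # motion intensity on this axis
    return feats
```

A block resting flat would yield near-zero standard deviations with the mean z-axis value pinned at gravity, while waving would raise the standard deviations sharply.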
Dancer-in-a-box
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669053
Yuichiro Katsumoto
Abstract: Dancer-in-a-Box is a project to create a self-propelled cardboard box without an external drive system such as wheels or a propeller. Since the dawn of time, humans have used boxes as static objects to store and transport everyday things easily, so we began to seek a new use of the box as an active object. Based on this concept, we developed a self-rolling robotic cube that can be installed in a cardboard box, and created several applications using the robotic cube for entertainment purposes.
Citations: 0
SmartSail: visualizing wind force on the sail to learn and enjoy sailing easily
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669050
Koh Sueda
Abstract: We have created SmartSail, a radio-controlled (R/C) toy sailboat for learning to sail in an easy and enjoyable way. SmartSail is an augmented-feedback user interface that visualizes the force on the sail to make controlling a sailboat easier.
Citations: 0
Click beetle-like jumping device for entertainment
SIGGRAPH Asia 2014 Emerging Technologies Pub Date : 2014-11-24 DOI: 10.1145/2669047.2669059
Akihiko Fukushima, Y. Kawaguchi
Abstract: We developed an interactive jumping device that mimics the jumping behavior of the click beetle. The purpose of this study is to explore the application of biomimetics to digital entertainment. Specifically, we focused on the psychological impact caused by the click beetle's jump, on the assumption that a click-beetle-inspired device can apply biomimetics to affect people's emotions. In addition, the device is designed to be small and light, making it safe enough for people to touch, and this touchable interaction can produce a rich entertainment experience.
Citations: 1