{"title":"GestureWrist and GesturePad: unobtrusive wearable interaction devices","authors":"J. Rekimoto","doi":"10.1109/ISWC.2001.962092","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962092","url":null,"abstract":"In this paper we introduce two input devices for wearable computers, called GestureWrist and GesturePad. Both devices allow users to interact with wearable or nearby computers by using gesture-based commands. Both are designed to be as unobtrusive as possible, so they can be used under various social contexts. The first device, called GestureWrist, is a wristband-type input device that recognizes hand gestures and forearm movements. Unlike DataGloves or other hand gesture-input devices, all sensing elements are embedded in a normal wristband. The second device, called GesturePad, is a sensing module that can be attached on the inside of clothes, and users can interact with this module from the outside. It transforms conventional clothes into an interactive device without changing their appearance.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129848790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Networked wearable musical instruments will bring a new musical culture","authors":"Kazushi Nishimoto, T. Maekawa, Yukio Tada, K. Mase, R. Nakatsu","doi":"10.1109/ISWC.2001.962096","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962096","url":null,"abstract":"Many people enjoy music daily, but in a very passive way. To more actively enjoy music, some novel musical instruments are necessary. As a candidate for a novel musical instrument, we propose a networked wearable musical instrument. This paper describes the design of the networked musical instrument as well as a prototype system. Furthermore, we demonstrate several possible novel applications of the networked wearable musical instrument. We think that this networked wearable musical instrument will bring us a novel type of musical entertainment and a novel musical culture.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130738147","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tinmith-Metro: new outdoor techniques for creating city models with an augmented reality wearable computer","authors":"W. Piekarski, B. Thomas","doi":"10.1109/ISWC.2001.962093","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962093","url":null,"abstract":"This paper presents new techniques for capturing and viewing on site 3D graphical models for large outdoor objects. Using an augmented reality wearable computer, we have developed a software system, known as Tinmith-Metro. Tinmith-Metro allows users to control a 3D constructive solid geometry modeller for building graphical objects of large physical artefacts, for example buildings, in the physical world. The 3D modeller is driven by a new user interface known as Tinmith-Hand, which allows the user to control the modeller using a set of pinch gloves and hand tracking. These techniques allow user to supply their AR renderers with models that would previously have to be captured with manual, time-consuming, and/or expensive methods.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126399337","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Authoring of physical models using mobile computers","authors":"Y. Baillot, Dennis G. Brown, S. Julier","doi":"10.1109/ISWC.2001.962094","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962094","url":null,"abstract":"Context-aware computers rely on user and physical models to describe the context of a user. In this paper, we focus on the problem of developing and maintaining a physical model of the environment using a mobile computer. We describe a set of tools for automatically creating and modifying three-dimensional contextual information. The tools can be utilized across multiple hardware platforms, with different capabilities, and operating in collaboration with one another. We demonstrate the capabilities of the tools using two mobile platforms. One of them, a mobile augmented reality system is used to construct a geometric model of an indoor environment which is then visualized on the same platform.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"282 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116082944","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Witnessential Net","authors":"S. Mann, R. Guerra","doi":"10.1109/ISWC.2001.962095","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962095","url":null,"abstract":"The Witnessential Network for the protection of Human Rights workers, and others who may be subjected to violence, is achieved through a new kind of imaging and hierarchical architecture having special properties ideal for defense against unaccountability of attackers. Incidentalist video capture and self-demotion are introduced as new collegial forms of defense against unaccountability. Results of various experiments conducted worldwide over the past 20 years, on the inventing, designing, building, and using of wearable photographic apparatus having these special properties are also described. Other fundamental concepts with respect to a Personal Safety Device suitable for Human Rights workers are introduced.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124512211","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time analysis of data from many sensors with neural networks","authors":"Kristof Van Laerhoven, K. Aidoo, S. Lowette","doi":"10.1109/ISWC.2001.962112","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962112","url":null,"abstract":"Much research has been conducted that uses sensor-based modules with dedicated software to automatically distinguish the user's situation or context. The best results were obtained when powerful sensors (such as cameras or GPS systems) and/or sensor-specific algorithms (like sound analysis) were applied A somewhat new approach is to replace the one smart sensor by many simple sensors. We argue that neural networks are ideal algorithms to analyze the data coming from these sensors and describe how we came to one specific algorithm that gives good results, by giving an overview of several requirements. Finally, wearable implementations are given to show the feasibility and benefits of this approach and its implications.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124957506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time hazard detection via machine vision for wearable low vision aids","authors":"Jordan Andersen, E. Seibel","doi":"10.1109/ISWC.2001.962137","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962137","url":null,"abstract":"The goal of the wearable low vision aid project is to create an assistive wearable device using optical scanning virtual retinal displays and wearable computer technologies for the enhancement of vision. The system is made up of a very small head-mounted camera, wearable computer and a low-cost compact retinal display. These three components collect, interpret and present in,formation about the indoor and outdoor environments respectively. The role of the software is to maximize the amount of useful information entering the eye. The anticipated result from the image processing is to identify the common navigational hazards of curbs, stairs and doorways and convey this information to a low vision user in one of several formats.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129632216","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Wearable computing for field archaeology","authors":"Chris Baber, J. Cross, Sandra I. Woolley, V. Gaffney","doi":"10.1109/ISWC.2001.962127","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962127","url":null,"abstract":"Wearable computers offer many potential uses in field archaeology. Our paper presents the uses identified in a Field Archaeology Wearable Computing Workshop held at The University of Birmingham, UK, and describes the challenges and outcomes of a prototype system trialed with 'The Birmingham University Field Archaeology Unit' at the Forum Novum archaeological site, Italy.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116495961","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lightglove: wrist-worn virtual typing and pointing","authors":"Bruce Howard, Susie Howard","doi":"10.1109/ISWC.2001.962130","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962130","url":null,"abstract":"We present the \"Lightglove\", a watch-size wireless virtual typing device worn underneath the wrist(s), with light beams sensing fingertips and motion sensors tracking hand movement. A miniature keyboard image is superimposed over the user's application on the host system display, and active keys corresponding to each finger are highlighted and pan around the key-map as the hand moves. Typing and pointing actions are similar to those with conventional keyboard and mouse, as though a physical device were underneath the hand. When the cursor moves out of the keyboard display area the Lightglove acts as a pointer (mouse). Applications include computer/PDA input, gaming control, TV remote control, and musical applications.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127797687","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Winspect: a case study for wearable computing-supported inspection tasks","authors":"Michael Boronowsky, T. Nicolai, C. Schlieder, Ansgar Schmidt","doi":"10.1109/ISWC.2001.962124","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962124","url":null,"abstract":"Introduces the Winspect project-an application of wearable computing in an industrial inspection process-with focus on its user interface. We present a case study to demonstrate the benefit of wearable input devices and the use of implicit interaction as a complementary technique. Two almost independent tasks from the application domain are addressed: the input of findings for inspected components in a harsh environment, and a technique to overcome the display resolution when browsing a hypertext-like documentation.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123972305","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}