{"title":"Drishti: an integrated navigation system for visually impaired and disabled","authors":"A. Helal, Steve Moore, B. Ramachandran","doi":"10.1109/ISWC.2001.962119","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962119","url":null,"abstract":"Drishti is a wireless pedestrian navigation system. It integrates several technologies including wearable computers, voice recognition and synthesis, wireless networks, Geographic Information System (GIS) and Global positioning system (GPS). Drishti augments contextual information to the visually impaired and computes optimized routes based on user preference, temporal constraints (e.g. traffic congestion), and dynamic obstacles (e.g. ongoing ground work, road blockade for special events). The system constantly guides the blind user to navigate based on static and dynamic data. Environmental conditions and landmark information queried from a spatial database along their route are provided on the fly through detailed explanatory voice cues. The system also provides capability for the user to add intelligence, as perceived by, the blind user, to the central server hosting the spatial database. Our system is supplementary to other navigational aids such as canes, blind guide dogs and wheel chairs.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123563355","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Wearable computers as packet transport mechanisms in highly-partitioned ad-hoc networks","authors":"J. Davis, A. Fagg, B. Levine","doi":"10.1109/ISWC.2001.962117","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962117","url":null,"abstract":"The decreasing size and cost of wearable computers and mobile sensors is presenting new challenges and opportunities for deploying networks. Existing network routing protocols provide reliable communication between nodes and allow for mobility and even ad-hoc deployment. They rely, however on the assumption of a dense scattering of nodes and end-to-end connectivity in the network. In this paper we address routing support for ad-hoc, wireless networks under conditions of sporadic connectivity and ever-present network partitions. This work proposes a general framework of agent movement and communication in which mobile computers physically carry packets across network partitions. We then propose algorithms that exploit the relative position of stationary devices and non-randonmess in the movement of mobile agents in the network. The learned structure of the network is used to inform an adaptive routing strategy With a simulation, we evaluate these algorithms and their ability to route packets efficiently through a highly-partitioned network.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123860212","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile capture for wearable computer usability testing","authors":"Kent Lyons, Thad Starner","doi":"10.1109/ISWC.2001.962099","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962099","url":null,"abstract":"The-mobility of wearable computers makes usability-testing; difficult. In order to fully understand how a user interacts with the wearable, the researcher must examine, both the user's direct interactions, with the, computer, as well as the external context the user perceives during their interaction. We present, a tool that augments a wearable computer with additional hardware and software to capture the information needed to perform a usability study in the field under realistic conditions. We examine the challenges in doing the capture and present our implementation. We also describe VizWear a tool for examining the captured data. Finally, we present our experiences using the system for a sample user study.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"85 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127410855","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"HI-Cam: intelligent biofeedback signal processing","authors":"Steve Mann, Daniel Chen, Sam Sadeghi","doi":"10.1109/ISWC.2001.962135","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962135","url":null,"abstract":"Humanistic intelligence (HI) is defined by two embodying elements. (1) It is a signal processing framework in which the human and the computer use each other in a feedback loop. (2) The HI processing apparatus is inextricably intertwined with the natural capabilities of the human mind and body. The Humanistic Intelligent Camera, or HI-Cam, is a wearable personal imaging application of HI. It uses physiological signal processing such as fast Fourier transform analysis on EEG signals to control various parameters of a personal cybernetics system. EyeTap devices are particularly well suited to being controlled by brainwaves, because the parameters of the system are observable on the screen of the EyeTap viewfinder. This provided a direct means of physiological control over a video capture program, creating a camera system embodying humanistic intelligence.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122229885","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Framework for power aware remote processing: design and implementation of a dynamic power estimation unit","authors":"G. Kaefer, J. Haid, Bernd Hofer, Gerhard Schall, R. Weiss","doi":"10.1109/ISWC.2001.962121","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962121","url":null,"abstract":"In this paper we present a \"Framework for Power Aware Remote Processing\" to minimize the energy consumption of mobile devices transparently. The main difference to remote processing frameworks already published consists in a novel integrated dynamic power estimation unit. This is an adaptive power consumption estimator, which estimates the energy consumption of software and system components. Based on the estimated power consumption an intelligent power manager migrates software components from the mobile device to remote machines, thus reducing the energy consumption of the mobile device.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126372457","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Active dressware: wearable proprioceptive systems based on electroactive polymers","authors":"D. Rossi, F. Lorussi, A. Mazzoldi, E. Scilingo, P. Orsini","doi":"10.1109/ISWC.2001.962123","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962123","url":null,"abstract":"A technology based on conducting polymers and rubber microdispersed carbon phases enables the realization of truly wearable instrumented garments capable of recording proprioceptive maps and of stiffening body and hand segments with no discomfort for the subject, negligible motion artifacts and body overload Applications are foreseen in virtual and augmented reality, teleoperation and rehabilitation fields.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131773450","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A wearable 3D augmented reality workspace","authors":"Gerhard Reitmayr, D. Schmalstieg","doi":"10.1109/ISWC.2001.962125","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962125","url":null,"abstract":"Describes our work to build a wearable augmented reality (AR) system that supports true stereoscopic 3D graphics. Through a pen and pad interface, well known 2D user interfaces can be presented to the user whereas the tracking of the pen allows us to use direct interaction with virtual objects. The system is assembled from off-the-shelf hardware components and serves as a basic test bed for user interface experiments related to collaboration between stationary and mobile AR users.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121250929","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A wearable cross-language communication aid","authors":"Jani Patokallio, Nigel G. Ward","doi":"10.1109/ISWC.2001.962133","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962133","url":null,"abstract":"Presents a wearable device, the Yak, that aids cross-language communication. The Yak produces utterances in the native's language at the user's command. The use of a heads-up display allows the user to operate the device while maintaining eye contact with the native and sending and receiving non-verbal signals. The paper describes requirements for a communication aid, the Yak's user interface, the Yak's current hardware configuration and scripting language, and the results of the first round of experiments.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"53 97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116033882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A method of key input with two mice","authors":"Satoshi Nakamura, M. Tsukamoto, S. Nishio","doi":"10.1109/ISWC.2001.962091","DOIUrl":"https://doi.org/10.1109/ISWC.2001.962091","url":null,"abstract":"Recently, due to remarkable advancements in computer technology, small mobile computers such as PDAs (Personal Digital Assistants) and palmtop computers are being developed. In the near future, it is feasible that wearable computers will become commonplace. Up until now, some text input systems for wearable computers have been proposed. However, these systems, depending on the situation, have many problems like low speed input and restrictions on usage. In this paper, we propose a text input method for wearable computing environments utilizing two trackballs based on our previous proposal, entitled \"DoubleMouse\". This method supports the wearable property that it can be utilized anytime and anyplace. In addition, based on the movement direction of the two mice this method can rapidly select a user's desired input symbols into the computer, thusly mimicking a keyboard. Further, this method supports a \"blind\" input mode that allows input without displaying additional components on the screen. We also show how the developed system is useful in practical operations based on our experiments.","PeriodicalId":239921,"journal":{"name":"Proceedings Fifth International Symposium on Wearable Computers","volume":"195 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116221642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}