{"title":"Methodology for Efficient Perception in Exclusively Haptic Environments","authors":"K. Kahol, T. McDaniel, S. Panchanathan","doi":"10.1109/HAVE.2006.283781","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283781","url":null,"abstract":"In haptic environments, it is especially demanding to design realistic interaction paradigms and provide global navigational cues. In this paper, we present a methodology that can replace and/or augment realistic haptic environments and is inspired by the psychological basis of haptics. The system employs haptic cueing to convey information about the shape, size, texture, and material of an object through user-determined cues. The key conceptual framework that guides this approach is that humans have haptic memory of an object, and sparse data about the object's features presented through cues can invoke spatial concepts that reveal the identity of the object. Second, the system represents an object's surface as a 2D raised surface map in the virtual environment. Since the haptic modality is specialized to perceive surface properties, surface rendering mimics the real environment and presents veridical sensations about the surface. We compare this methodology to conventional rendering of haptic objects, where the object is sensed in its entirety through a haptic interface. The paper further studies the usability of realistic haptic rendering 1) in egocentric and exocentric reference frames, and 2) with tactile and/or force feedback. 
Initial results indicate that tactile cueing in combination with realistic rendering of surfaces with force feedback in an egocentric reference frame leads to the most efficient perception","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127142186","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Pressure Sensitive Home Environment","authors":"M. Howell Jones, A. Arcelus, R. Goubran, F. Knoefel","doi":"10.1109/HAVE.2006.283799","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283799","url":null,"abstract":"Homes could be equipped with unobtrusive pressure sensors to monitor older adults. This paper deals with the processing, analysis, and communication of pressure sensor outputs that would enable such monitoring. An example is shown of a 63-year-old adult who slept on top of a pressure sensor array. Her nocturnal respiratory rate was monitored via the pressure sensor array. Additionally, her bed time, rise time, and out-of-bed times were accurately recorded using the methods proposed herein. This information was presented through a secure Web interface, which would allow a caregiver simple and intuitive access to client data","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126679586","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tactile sensing by the sole of the foot: part i: apparatus and initial experiments toward obtaining dynamic pressure maps useful for stabilizing standing, walking, and running of humanoid robots","authors":"A. Kalamdani, C. Messom, M. Siegel","doi":"10.1109/HAVE.2006.283783","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283783","url":null,"abstract":"This paper introduces a prototype experimental apparatus to investigate stability in standing, walking, and running of humanoid robots using pressure sensing at the foot contact. The preliminary experiments show that the system can provide very good spatial or temporal resolution, and these can be traded off against each other given the problem at hand, such as sparsely sampling the whole foot during static balancing or densely sampling the impact point of the foot during walking or running. The prototype apparatus, experimental results, and dynamic models of the system will give insight into the nature of balance control","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123418950","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enabling mobile haptic design: piezoelectric actuator technology properties in hand held devices","authors":"P. Laitinen, J. Maenpaa","doi":"10.1109/HAVE.2006.283787","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283787","url":null,"abstract":"Mobile haptic and multimodal design has been strongly constrained by the lack of enabling actuator technologies. In this paper, the properties of piezoelectric actuator technology for haptic feedback in mobile devices are described. Through careful actuator design and integration into mobile devices, this technology provides a realistic possibility for haptic and multimodal design with considerable latitude. Characteristics of mobile device usage are described and their impact on haptic design is discussed","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125765666","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MPEG-7 Description of Haptic Applications Using HAML","authors":"M. Eid, A. Alamri, A. E. Saddik","doi":"10.1109/HAVE.2006.283780","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283780","url":null,"abstract":"The continuous evolution of computer haptics, as well as the emergence of a wide range of haptic interfaces, has recently boosted the haptics domain. Even though efficient tools that support the developer's work exist, little attention is paid to the reuse and compatibility of haptic application constituents. In response to these issues, we propose an XML-based description language, namely the Haptic Application Meta Language (HAML). HAML is designed to provide a technology-neutral description of haptic models. It contains ergonomic requirements and specifications for haptic hardware and software interactions. The envisioned goal is to allow for the creation of plug-and-play environments in which a wide array of supported haptic devices can be used in a multitude of virtual environments, with the compatibility issues being handled by automated engines instead of programmatically by the user. For the implementation, the MPEG-7 standard has been used to instantiate the HAML schema through the use of description schemes (DS). 
Our preliminary experimentation demonstrates the suitability of HAML for solving the compatibility issue","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129142070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interactive Virtual Simulation for Multiple Camera Placement","authors":"J. Williams, Won-sook Lee","doi":"10.1109/HAVE.2006.283777","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283777","url":null,"abstract":"Camera placement plays an integral role in image-based 3D object reconstruction. Poor camera placement can lead to poor reconstruction if important silhouette or texture details are not captured in one view, if a camera is too close and the object silhouette is clipped, or if a camera is too far to provide a high-resolution capture. Some of these constraints are in contention. Manually adjusting real world cameras to satisfy these criteria is difficult and tedious as cameras may have to be unmounted and moved, their output checked for fitness of view, then readjusted. An interactive virtual camera positioning utility is presented that helps the reconstructor quickly and easily choose intuitively good positions, with emphasis on preventing clipping and resolution loss by visualizing the camera-set's mutual viewable volume","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"429 1-3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123437174","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GJK for Deformable Object Collision Detection","authors":"M. Hatab, A. Kheddar","doi":"10.1109/HAVE.2006.283805","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283805","url":null,"abstract":"GJK is one of the main methods for distance calculations between convex objects. In this article, an adaptation of GJK for deformable object collision detection is proposed. Although the original method is only suited for distance calculations, the proposed method is capable of finding the colliding triangle pairs between two continuously deforming virtual objects in real time. Furthermore, it handles all deformation types at no extra time cost, and it allows the client application to add or remove triangles from the considered object meshes at run time with no extra overhead for the collision detection. The proposed method is very flexible in many aspects, making it an ideal choice for virtual reality and haptic applications","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130710200","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hapto-audio-visual environments for collaborative training of ophthalmic surgery over optical network","authors":"P. Boulanger, G. Wu, W. Bischof, Xing-Dong Yang","doi":"10.1109/HAVE.2006.283801","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283801","url":null,"abstract":"This paper presents the results of a two-year project to develop a shared hapto-visual-audio-virtual environment (HAVE) with advanced multi-point video conferencing, new display and interface technologies, and distributed latency-compensated haptic technologies for collaborative medical research and training in ophthalmology. One of the goals of this project is to create a collaborative training environment in which residents can remotely learn cataract operations, in real time, from actual operations performed by teaching surgeons. The assumption of this work is that a trainee surgeon can learn the complex hand-eye coordination necessary for becoming a good ophthalmic surgeon by feeling and seeing every move the expert surgeon makes, through a complex haptic, auditory, and visual playback interface. Experimental results are presented","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114620520","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Compression of cubic-panorama datasets with spatially consistent representation","authors":"Kehua Jiang, E. Dubois","doi":"10.1109/HAVE.2006.283774","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283774","url":null,"abstract":"Efficient compression of cubic-panorama datasets is crucial to reducing the required storage space and transmission bandwidth. In traditional coding schemes, cubic-panorama datasets are treated as planar image sequences with a rectangular support. In this way, some coding efficiency is inevitably sacrificed because of inconsistencies at the boundaries of connected side images. We have developed a spatially consistent representation for cubic panoramas, and applied a motion-compensated temporal filtering (MCTF) coding scheme to compress cubic-panorama datasets. Specific approaches for constructing the reference blocks on the corners of cubes are designed for motion estimation. The search for motion vectors in reference frames can be naturally extended across side-image boundaries into neighboring side images. The spatially consistent representation of cubic panoramas eliminates image boundary constraints for motion vector search. The search for motion vectors as well as matching reference blocks can be extended well beyond the side-image boundaries in all four directions. Better-matched reference blocks can be obtained to further reduce the prediction errors and improve the compression efficiency. The compression scheme is adapted to the features of cubic-panorama datasets. The experimental results of applying the proposed compression scheme to coding sample cubic-panorama datasets are presented. 
It is shown that superior coding performance is achieved with the spatially consistent representation compared with the generic representation of cubic-panorama datasets","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"365 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116519992","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Zone Based Messaging in Collaborative Virtual Environments","authors":"D. Ahmed, S. Shirmohammadi, H. Kazem","doi":"10.1109/HAVE.2006.283794","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283794","url":null,"abstract":"Massively multi-user simulation requires synchronous communication among the parties. In this paper, we present a multi-user collaboration architecture that divides the virtual world into multiple adjacent hexagonal regions in order to properly organize the entities and efficiently manage their interactions. A special node, called a hybrid node, is in charge of each hexagonal region and constructs a data distribution tree at the application layer rather than at the network layer. While constructing the data distribution pathways among the end hosts, the protocol aims to reflect the physical topology onto the overlay network to enhance system performance. To control excessive message overhead, messages from a foreign region are imported into a particular region through a hybrid node only when needed. Dynamic adjustment of check-in and check-out marks reduces frequent connections and disconnections between a hybrid and an ordinary node, and provides resilience to the system. The effectiveness of this collaboration architecture is tested through implementation","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121980023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}