{"title":"3D Input Using Hand-held Objects and Computer Vision","authors":"M. Fiala","doi":"10.1109/HAVE.2006.283771","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283771","url":null,"abstract":"An untethered hand-held object whose pose can be determined automatically is useful for human-computer interaction (HCI) applications such as gaming and augmented reality (AR). Computer vision and fiducial marker systems can be employed to turn ergonomic objects such as rubber balls, toy swords and guns, balloons, etc. into low-cost input devices. A digital video camera such as a consumer webcam is all that is needed to allow the 3D pose of a naturally held object to be used as input to control a CAD system, game, or other GUI interface. Markers are placed on the object, whose positions are learned by the system automatically and used to determine object pose in a real-time system.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127173187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multi-Modal Interface for a Real-Time CFD Solver","authors":"M. Kasakevich, P. Boulanger, W. Bischof, M. García","doi":"10.1109/HAVE.2006.283800","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283800","url":null,"abstract":"Advances in computer processing power and networking over the past few years have brought significant changes to the modeling and simulation of complex phenomena. Problems that formerly could only be tackled in batch mode, with their results visualized afterwards, can now be monitored whilst in progress using graphical means. In certain cases, it is even possible to alter parameters of the computation whilst it is running, depending on what the scientist perceives in the current visual output. This ability to monitor and change parameters of the computational process at any time and from anywhere is called computational steering. Combining this capability with advanced multi-modal tools to explore the data produced by these systems is key to our approach. In this paper, we present an advanced multi-modal interface where sonification and 3D visualization are used in a computational steering environment specialized to solve real-time Computational Fluid Dynamics (CFD) problems. More specifically, this paper describes how sonification of CFD data can be used to augment 3D visualization.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116957834","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hand-writing Rehabilitation in the Haptic Virtual Environment","authors":"Youn K. Kim, Xiaoli Yang, Kimy","doi":"10.1109/HAVE.2006.283792","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283792","url":null,"abstract":"In recent years, virtual reality (VR) has been applied to many medical fields. Rehabilitation with VR technologies is one of the emerging research directions. In this paper, a rehabilitation exercise - an English character handwriting training exercise - is described in the haptic virtual environment. The approach can be used to guide the user's hand movement along the right track of the predefined trajectory according to a real-time guidance force. A haptic device, the PHANTOM Premium 1.0, is used as a virtual pen for writing English characters, and also as a guidance tool that gives the user force feedback. Users with hand dysfunctions can be trained in precise hand movements through this writing rehabilitation interface. Experimental results and a performance analysis are also given in the paper.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122388098","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A distributed, collaborative and haptic-enabled eye cataract surgery application with a user interface on desktop, stereo desktop and immersive displays","authors":"A. Hamam, S. Nourian, N. R. El-Far, F. Malric, Xiaojun Shen, N. Georganas","doi":"10.1109/HAVE.2006.283773","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283773","url":null,"abstract":"In this paper, we discuss the technologies and approaches utilized in developing a cataract eye surgery simulation that will be used for training novice surgeons. The three different techniques described in this paper are all hapto-visual techniques that resulted in three different implementations of the application: a 2D simulation, a 3D immersive simulation, and a completely immersive simulation. The paper begins with an introduction and a general overview of the medical procedure of cataract surgery. Then an overview of the eye and surgical-tool modeling is given. Following that, the architecture and technology of each of the three design techniques are given. Finally, the paper concludes with future work to improve the application.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134460624","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dark Matter Method for Correct Augmented Reality Occlusion Relationships","authors":"M. Fiala","doi":"10.1109/HAVE.2006.283554","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283554","url":null,"abstract":"In augmented reality (AR), virtual objects are rendered to appear to co-exist with a real scene and real objects. Usually, augmentations are simply drawn on top of the camera image, which does not work for cases where the virtual object should be partially or completely occluded by real objects in the scene. A solution is proposed where a virtual object that approximates the shape of the real object provides the correct occlusion relationship but is not drawn to the output image, an approach useful to both real-time AR and off-line movie effects. This \"dark matter\" method is used in a \"magic mirror\" AR system.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122525695","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Vibrotactile Perception: Differential Effects of Frequency, Amplitude, and Acceleration","authors":"Helena Pongrac","doi":"10.1109/HAVE.2006.283803","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283803","url":null,"abstract":"High-frequency vibrations are an essential part of numerous manipulation tasks. One particularly promising research area is telemanipulation, where vibrations occurring in the remote environment are fed back through tactile displays. Two experiments concerning the perception of vibrations were conducted. The first experiment aims at determining whether vibrations are coded primarily by frequency, amplitude, or acceleration by human subjects. Results show that primarily frequency and amplitude, but not acceleration, of the vibrations were perceived. In the second experiment, subjects' just-noticeable difference (JND) for frequency under different conditions was examined. The resulting JND of 18% for frequency showed neither dependence on amplitude or acceleration, which were independently held constant, nor on the reference frequencies. Therefore, it is not necessary to adjust the subjective intensity of vibrations for each human operator when designing tactile displays.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128000591","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A 3D Annotation Interface Using the DIVINE Visual Display","authors":"K. Osman, F. Malric, S. Shirmohammadi","doi":"10.1109/HAVE.2006.283798","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283798","url":null,"abstract":"While systems such as CAVEs have been experimented with and used for a number of years, their deployment has been slow, mainly due to their expense and space requirements. As such, researchers have been moving towards smaller and cheaper immersive systems. In this paper, we introduce an immersive interface for manipulating 3D objects using the DIVINE system.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114810668","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Novel Method for Supporting Massively Multi-user Virtual Environments","authors":"D. Ahmed, S. Shirmohammadi, J. de Oliveira","doi":"10.1109/HAVE.2006.283807","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283807","url":null,"abstract":"In a collaborative distributed virtual environment, people interact with each other to share their states. In this paper we present a massively multi-user virtual simulation architecture - MMVISA. The framework partitions the simulation platform into multiple regions to properly organize the decorative entities and to efficiently manage their association. The coordinator, the leader of a zone, manages local communications in a multicast fashion, but this multicast functionality is shifted from the network layer to the application layer to benefit from scalability and easy deployability. Considering the behavior of the entities, the coordinator opens multiple multicast channels to reduce structural reformation events among the entities. On the other hand, the coordinators themselves form a top-level mesh hierarchy to manage the area of interest. A mathematical model is given to determine the best node in the best zone for a given interest vector. This provides a node with the ability to import all the interesting messages when needed and makes it easy to be virtually there.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116419738","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Assisted Piano Pedagogy through 3D Visualization of Piano Playing","authors":"J. Mora, Won-sook Lee, G. Comeau, S. Shirmohammadi, A. El Saddik","doi":"10.1109/HAVE.2006.283791","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283791","url":null,"abstract":"Having a correct posture at the piano requires a lot of practice to master; however, visual feedback can help students realize potential problems and adopt a better position when playing the instrument. This paper discusses an innovative application of the techniques used for the 3D visualization of piano performances from any possible view, with the purpose of comparing them to ideal piano playing. It includes the capture and reconstruction of the 3D motion and posture of a professional piano player so that it may be compared against the posture and movements of students, by overlaying 2D videos of their recitals. Issues with respect to displaying data and adding useful interface features are also discussed. The motion can be adjusted to fit the measurements of anthropometrically different students, so that learners can see themselves and better understand what position they should adopt while they are at the piano.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117227974","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards a Standard Modeling of Haptic Software System","authors":"A. Alamri, M. Eid, A. El Saddik","doi":"10.1109/HAVE.2006.283552","DOIUrl":"https://doi.org/10.1109/HAVE.2006.283552","url":null,"abstract":"Computer haptics refers to the discipline concerned with generating and rendering haptic stimuli to the human user. The last decade has witnessed rapid progress in haptic application software development. We envision a need for a standard for haptic application software modeling. This paper introduces the approach of Unified Modeling Language based haptic software engineering. We present the rationale and a reference model for haptic software development, and propose the basic modeling technique that comprises modeling elements, notation, and methods for haptic software systems. A startup systematic engineering process that describes how a haptic software system could be developed is also presented. Finally, we summarize our findings and provide a vision for future work.","PeriodicalId":365320,"journal":{"name":"2006 IEEE International Workshop on Haptic Audio Visual Environments and their Applications (HAVE 2006)","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126613005","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}