Ziteng Liu, Wenpeng Gao, Yu Sun, Yixian Su, Jiahua Zhu, Lubing Xu, Yili Fu
{"title":"一种用于探索人体解剖的非接触式交互式立体显示系统。","authors":"Ziteng Liu, Wenpeng Gao, Yu Sun, Yixian Su, Jiahua Zhu, Lubing Xu, Yili Fu","doi":"10.1080/24699322.2018.1560083","DOIUrl":null,"url":null,"abstract":"Stereoscopic display based on Virtual Reality (VR) can facilitate doctors to observe the 3 D virtual anatomical models with the depth cues, assist them in intuitively investigating the spatial relationship between different anatomical structures without mental imagination. However, there is few input device can be used in controlling the virtual anatomical models in the sterile operating room. This paper presents a cost-effective VR application system for demonstration of 3 D virtual anatomical models with non-contact interaction and stereo display. The system is integrated with hand gesture interaction and voice interaction to achieve non-contact interaction. Hand gesture interaction is implemented based on a Leap Motion controller mounted on the Oculus Rift DK2. Voice is converted into operation using Bing Speech for English language and Aitalk for Chinese language, respectively. A local relationship database is designed to record the anatomical terminologies to speech recognition engine to query these uncommon words. The hierarchical nature of these terminologies is also recorded in a tree structure. In the experiments, ten participants were asked to perform the evaluation on the proposed system. The results show that our system is more efficient than traditional interactive manner and verify the feasibility and practicability in the sterile operating room.","PeriodicalId":56051,"journal":{"name":"Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":1.5000,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/24699322.2018.1560083","citationCount":"0","resultStr":"{\"title\":\"A non-contact interactive stereo display system for exploring human anatomy.\",\"authors\":\"Ziteng Liu, Wenpeng Gao, Yu Sun, Yixian Su, Jiahua Zhu, Lubing Xu, Yili Fu\",\"doi\":\"10.1080/24699322.2018.1560083\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Stereoscopic display based on Virtual Reality (VR) can facilitate doctors to observe the 3 D virtual anatomical models with the depth cues, assist them in intuitively investigating the spatial relationship between different anatomical structures without mental imagination. However, there is few input device can be used in controlling the virtual anatomical models in the sterile operating room. This paper presents a cost-effective VR application system for demonstration of 3 D virtual anatomical models with non-contact interaction and stereo display. The system is integrated with hand gesture interaction and voice interaction to achieve non-contact interaction. Hand gesture interaction is implemented based on a Leap Motion controller mounted on the Oculus Rift DK2. Voice is converted into operation using Bing Speech for English language and Aitalk for Chinese language, respectively. A local relationship database is designed to record the anatomical terminologies to speech recognition engine to query these uncommon words. The hierarchical nature of these terminologies is also recorded in a tree structure. In the experiments, ten participants were asked to perform the evaluation on the proposed system. 
The results show that our system is more efficient than traditional interactive manner and verify the feasibility and practicability in the sterile operating room.\",\"PeriodicalId\":56051,\"journal\":{\"name\":\"Computer Assisted Surgery\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2019-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1080/24699322.2018.1560083\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Assisted Surgery\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1080/24699322.2018.1560083\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"SURGERY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Assisted Surgery","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1080/24699322.2018.1560083","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"SURGERY","Score":null,"Total":0}
A non-contact interactive stereo display system for exploring human anatomy.
Stereoscopic display based on Virtual Reality (VR) allows doctors to observe 3D virtual anatomical models with depth cues, helping them intuitively investigate the spatial relationships between anatomical structures without relying on mental reconstruction. However, few input devices can be used to control virtual anatomical models in a sterile operating room. This paper presents a cost-effective VR application system for demonstrating 3D virtual anatomical models with non-contact interaction and stereoscopic display. The system combines hand gesture interaction and voice interaction to achieve non-contact control. Hand gesture interaction is implemented with a Leap Motion controller mounted on an Oculus Rift DK2. Voice commands are converted into operations using Bing Speech for English and Aitalk for Chinese. A local relational database records anatomical terminologies so that the speech recognition engine can query these uncommon words, and the hierarchical nature of the terminologies is stored in a tree structure. In the experiments, ten participants evaluated the proposed system. The results show that the system is more efficient than traditional interaction methods and verify its feasibility and practicability in the sterile operating room.
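The abstract only sketches how the terminology store works: a local database of anatomical terms, organized as a tree, that the speech recognition stage consults for uncommon words. As an illustration of that general idea, the following Python sketch is one possible realization of a phrase-to-node lookup over a terminology tree. All class and method names (AnatomyNode, TerminologyIndex, resolve, etc.) are hypothetical and are not taken from the paper.

```python
# Minimal sketch of a tree-structured anatomical terminology store.
# The paper does not describe its database schema, so this design is an assumption.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class AnatomyNode:
    """One anatomical structure in the terminology hierarchy."""
    term: str                                            # canonical term, e.g. "left atrium"
    aliases: List[str] = field(default_factory=list)     # spoken variants the recognizer may return
    children: List["AnatomyNode"] = field(default_factory=list)


class TerminologyIndex:
    """Maps recognized speech to nodes and exposes the hierarchy for display control."""

    def __init__(self, root: AnatomyNode) -> None:
        self.root = root
        self._by_phrase: Dict[str, AnatomyNode] = {}
        self._index(root)

    def _index(self, node: AnatomyNode) -> None:
        # Register the canonical term and all aliases, then recurse into children.
        for phrase in [node.term, *node.aliases]:
            self._by_phrase[phrase.lower()] = node
        for child in node.children:
            self._index(child)

    def resolve(self, recognized_text: str) -> Optional[AnatomyNode]:
        """Return the anatomical structure matching a recognized phrase, if any."""
        return self._by_phrase.get(recognized_text.lower().strip())

    def subtree_terms(self, node: AnatomyNode) -> List[str]:
        """All terms under a node, e.g. to show or hide a whole organ system at once."""
        terms = [node.term]
        for child in node.children:
            terms.extend(self.subtree_terms(child))
        return terms


# Usage: build a tiny hierarchy and resolve a spoken command.
heart = AnatomyNode("heart", aliases=["cardiac model"], children=[
    AnatomyNode("left atrium"), AnatomyNode("left ventricle"),
])
index = TerminologyIndex(AnatomyNode("body", children=[heart]))
node = index.resolve("cardiac model")
if node:
    print(index.subtree_terms(node))   # ['heart', 'left atrium', 'left ventricle']
```

A flat phrase-to-node map built over the tree keeps lookups constant-time while the tree itself preserves the hierarchy needed to manipulate groups of structures together; the paper's actual implementation may differ.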
Journal introduction:
Computer Assisted Surgery aims to improve patient care by advancing the utilization of computers during treatment; to evaluate the benefits and risks associated with the integration of advanced digital technologies into surgical practice; to disseminate clinical and basic research relevant to stereotactic surgery, minimal access surgery, endoscopy, and surgical robotics; to encourage interdisciplinary collaboration between engineers and physicians in developing new concepts and applications; to educate clinicians about the principles and techniques of computer assisted surgery and therapeutics; and to serve the international scientific community as a medium for the transfer of new information relating to theory, research, and practice in biomedical imaging and the surgical specialties.
The scope of Computer Assisted Surgery encompasses all fields within surgery, as well as biomedical imaging and instrumentation, and digital technology employed as an adjunct to imaging in diagnosis, therapeutics, and surgery. Topics featured include frameless as well as conventional stereotactic procedures, surgery guided by intraoperative ultrasound or magnetic resonance imaging, image guided focused irradiation, robotic surgery, and any therapeutic interventions performed with the use of digital imaging technology.