Virtual Reality. Pub Date: 2024-08-02. DOI: 10.1007/s10055-024-01040-w
Inoussa Ouedraogo, Huyen Nguyen, Patrick Bourdot
Title: Immersive analytics with augmented reality in meteorology: an exploratory study on ontology and linked data
Abstract: Although Augmented Reality (AR) has been extensively studied as a support for Immersive Analytics (IA), many challenges remain in visualising and interacting with big and complex datasets. To deal with these datasets, most AR applications use NoSQL databases for storing and querying data, especially for managing large volumes of unstructured or semi-structured data. However, NoSQL databases have limited reasoning and inference capabilities, which can result in insufficient support for certain types of queries. To fill this gap, we explore and evaluate whether an intelligent approach based on ontology and linked data can facilitate visual analytics tasks with big datasets on an AR interface. We designed and implemented a prototype of this method for meteorological data analytics. An experiment was conducted to evaluate the use of a semantic database with linked data against a conventional approach in an AR-based immersive analytics system. The results highlight the performance of the semantic approach in helping users analyse meteorological datasets, as well as their subjective appreciation of working with the AR interface enhanced with ontology and linked data.
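To illustrate the kind of query where inference matters, here is a minimal, self-contained sketch, not the paper's implementation, of a triple store that answers a query through transitive subclass reasoning. All class and instance names below are hypothetical examples; a flat key-value (NoSQL) lookup indexed only on stored type strings would miss the inferred match.

```python
# Illustrative sketch: semantic triple store with naive fixpoint inference.
# Class/instance names are invented for the example, not from the paper.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def subclasses(self, cls):
        """All transitive subclasses of `cls`, found by naive fixpoint iteration."""
        found = {cls}
        changed = True
        while changed:
            changed = False
            for s, p, o in self.triples:
                if p == "subClassOf" and o in found and s not in found:
                    found.add(s)
                    changed = True
        return found

    def instances_of(self, cls):
        """Instances typed as `cls` or any inferred subclass of it."""
        classes = self.subclasses(cls)
        return {s for s, p, o in self.triples if p == "type" and o in classes}

store = TripleStore()
store.add("CumulonimbusObservation", "subClassOf", "CloudObservation")
store.add("CloudObservation", "subClassOf", "WeatherObservation")
store.add("obs42", "type", "CumulonimbusObservation")

# A lookup for records typed exactly "WeatherObservation" would miss obs42;
# the inference-aware query finds it through the class hierarchy.
print(store.instances_of("WeatherObservation"))  # {'obs42'}
```

In practice a semantic database would use an RDF store with SPARQL and an OWL/RDFS reasoner rather than this hand-rolled loop, but the query-time difference it demonstrates is the same.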
Virtual Reality. Pub Date: 2024-08-01. DOI: 10.1007/s10055-024-01039-3
Woojin Cho, Taewook Ha, Ikbeom Jeon, Jinwoo Jeon, Tae-Kyun Kim, Woontack Woo
Title: Temporally enhanced graph convolutional network for hand tracking from an egocentric camera
Abstract: We propose a robust 3D hand tracking system for varied hand action environments, including hand-object interaction, which takes a single color image and a previous pose prediction as input. We observe that existing methods exploit temporal information in motion space deterministically, failing to address realistically diverse hand motions. Prior methods have also paid less attention to efficiency alongside robust performance, i.e., the balance between time and accuracy. Our Temporally Enhanced Graph Convolutional Network (TE-GCN) uses a two-stage framework to encode temporal information adaptively. The system establishes this balance by adopting an adaptive GCN, which effectively learns the spatial dependencies between hand mesh vertices. Furthermore, the system leverages the previous prediction by estimating its relevance to image features through an attention mechanism. The proposed method achieves state-of-the-art balanced performance on challenging benchmarks and demonstrates robust results on various hand motions in real scenes. Moreover, the hand tracking system is integrated into a recent HMD with an off-loading framework, achieving a real-time framerate while maintaining high performance. Our study improves the usability of a high-performance hand-tracking method, which can be generalized to other algorithms and contributes to the use of HMDs in everyday life. Our code and the HMD project will be available at https://github.com/UVR-WJCHO/TEGCN_on_Hololens2.
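To illustrate the underlying idea of a graph convolution learning spatial dependencies between mesh vertices, here is a deliberately tiny sketch of one propagation step in plain Python. This is not the TE-GCN architecture (which learns adaptive weights and encodes temporal information); the graph and feature values are toy inputs, and the "convolution" is reduced to mean pooling over each vertex's neighbourhood.

```python
# Toy sketch of one graph-convolution propagation step over mesh vertices.
# Each vertex's new feature is the mean of its own and its neighbours'
# features (self-loop included). Real GCN layers multiply by learned weights.

def gcn_step(adj, feats):
    """adj: n x n 0/1 adjacency matrix; feats: list of n feature vectors."""
    n = len(feats)
    dim = len(feats[0])
    out = []
    for i in range(n):
        neigh = [j for j in range(n) if adj[i][j]] + [i]  # neighbours + self
        pooled = [sum(feats[j][d] for j in neigh) / len(neigh) for d in range(dim)]
        out.append(pooled)
    return out

# Tiny 3-vertex chain graph: 0 - 1 - 2
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
feats = [[1.0], [0.0], [1.0]]
print(gcn_step(adj, feats))  # the middle vertex pools information from both ends
```

Stacking such steps is what lets information propagate across the hand mesh, so that the predicted position of one vertex is informed by its structural neighbours.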
Virtual Reality. Pub Date: 2024-07-27. DOI: 10.1007/s10055-024-01041-9
Ruowei Xiao, Rongzheng Zhang, Oğuz Buruk, Juho Hamari, Johanna Virkki
Title: Toward next generation mixed reality games: a research through design approach
Abstract: Mixed reality (MR) games integrate physical entities with digitally mediated content. Currently, building such games requires creators to combine heterogeneous virtual and physical components without the support of a coherent technology stack, which is often time-consuming and labor-intensive. The technodiversity manifested in the research corpus suggests a complicated, multi-dimensional design space that goes beyond merely technical concerns. In this research, we adopted a research-through-design approach and proposed an MR game technology stack that facilitates flexible, low-code game development. As design grounding, we first surveyed 34 state-of-the-art studies and synthesized the results into three spectra of technological affordances, namely activity range, user interface, and feedback control, to inform our design process. We then went through an iterative prototyping phase and implemented an MR game development toolset. A co-design workshop was conducted in which we invited 15 participants to try the prototype tools and co-ideate potential use scenarios for the proposed technology stack. First-hand user feedback was collected via questionnaires and semi-structured interviews. As a result, four conceptual game designs with three major design implications were generated, which jointly reflect a broader understanding of MR gameful experience and contribute fresh insights to this emerging research domain.
Virtual Reality. Pub Date: 2024-07-24. DOI: 10.1007/s10055-024-01033-9
Ali Buwaider, Victor Gabriel El-Hajj, Alessandro Iop, Mario Romero, Walter C Jean, Erik Edström, Adrian Elmi-Terander
Title: Augmented reality navigation in external ventricular drain insertion—a systematic review and meta-analysis
Abstract: External ventricular drain (EVD) insertion using the freehand technique is often associated with misplacements resulting in unfavorable outcomes. Augmented Reality (AR) has been increasingly used to complement conventional neuronavigation. The accuracy of AR-guided EVD insertion has been investigated in several studies on anthropomorphic phantoms, cadavers, and patients. This review aimed to assess the current knowledge and discuss potential benefits and challenges associated with AR guidance in EVD insertion. MEDLINE, EMBASE, and Web of Science were searched from inception to August 2023 for studies evaluating the accuracy of AR guidance for EVD insertion. Studies were screened for eligibility and accuracy data were extracted. Risk of bias was assessed using the Cochrane Risk of Bias Tool and quality of evidence using the Newcastle-Ottawa Scale. Accuracy was reported either as the average deviation from target or according to the Kakarla grading system. Of the 497 studies retrieved, 14 were included for analysis. All included studies were prospectively designed. Insertions were performed on anthropomorphic phantoms, cadavers, or patients, using several different AR devices and interfaces. Deviation from target ranged between 0.7 and 11.9 mm. Accuracy according to the Kakarla grading scale ranged between 82 and 96%. Accuracy was higher for AR than for the freehand technique in all studies with control groups. Current evidence demonstrates that AR is more accurate than the freehand technique for EVD insertion. However, studies are few, the technology is still developing, and further studies on patients in relevant clinical settings are needed.
Virtual Reality. Pub Date: 2024-07-23. DOI: 10.1007/s10055-024-01035-7
H. A. T. van Limpt-Broers, M. Postma, E. van Weelden, S. Pratesi, M. M. Louwerse
Title: Neurophysiological evidence for the overview effect: a virtual reality journey into space
Abstract: The Overview Effect is a complex experience reported by astronauts after viewing Earth from space. Numerous accounts suggest that it leads to increased interconnectedness with other human beings and environmental awareness, comparable to self-transcendence. It can cause fundamental changes in mental models of the world, improved well-being, and a stronger appreciation of, and responsibility for, Earth. From a cognitive perspective, it is closely linked to the emotion of awe, possibly triggered by the overwhelming perceived vastness of the universe. Given that most research in the domain relies on self-reports, little is known about potential neurophysiological markers of the Overview Effect. In the experiment reported here, participants viewed an immersive Virtual Reality simulation of a space journey while their brain activity was recorded using electroencephalography (EEG). Post-experimental self-reports confirmed that they were able to experience the Overview Effect in the simulated environment. EEG recordings revealed lower spectral power in the beta and gamma frequency bands during the defining moments of the Overview Effect. This decrease in spectral power can be associated with reduced mental processing and, in this context, a disruption of known mental structures, providing further evidence for the cognitive effects of the experience.
Virtual Reality. Pub Date: 2024-07-19. DOI: 10.1007/s10055-024-01027-7
Mantaj Singh, Peter Smitham, Suyash Jain, Christopher Day, Thomas Nijman, Dan George, David Neilly, Justin de Blasio, Michael Gilmore, Tiffany K. Gill, Susanna Proudman, Gavin Nimon
Title: Exploring the viability of Virtual Reality as a teaching method for knee aspiration
Abstract: Knee arthrocentesis is a simple procedure commonly performed by general practitioners and junior doctors. As such, doctors should be competent and comfortable performing the technique by themselves; however, they need to be adequately trained. The best way to ensure practitioner proficiency is to optimize teaching at an institutional level, educating all future doctors in the procedure. However, the Coronavirus Disease 2019 (COVID-19) pandemic caused significant disruption to hospital teaching for medical students, which prompted an investigation of the effectiveness of virtual reality (VR) as a platform to emulate hospital teaching of knee arthrocentesis. A workshop was conducted with 100 fourth-year medical students divided into three groups, A, B, and C, each receiving a pre-reading online lecture. Group A was placed directly in an Objective Structured Clinical Examination (OSCE) station, where they were assessed by a blinded orthopaedic surgeon using the OSCE assessment rubric. Group B undertook a hands-on practice station prior to assessment, while Group C received a VR video (courtesy of the University of Adelaide's Health Simulation), in the form of a VR headset or a 360° surround immersion room, plus a hands-on station, followed by the OSCE. Upon completion of the workshop, students completed a questionnaire on their confidence with the procedure and the practicality of the VR station. OSCE scores were compared between Groups B and C to investigate the educational value of VR teaching. On average, students with VR headsets reported higher confidence with the procedure and were more inclined to undertake it on their own. Students in Group C, who used the VR station prior to assessment, scored higher than the non-VR groups (Group A, 56%; Group B, 67%; Group C, 83%). The difference between Groups A and B was statistically significant (t(69) = 3.003, p = 0.003), as was the difference between Groups B and C (t(62) = 5.400, p < 0.001). Within Group C, students who were given VR headsets scored higher than immersion-room students. The VR headset was beneficial in giving students a representation of how knee arthrocentesis may be conducted in the hospital setting. While VR will not replace conventional in-hospital teaching, given current technological limitations, it serves as an effective teaching aid for arthrocentesis and has many other potential applications across medicine and surgical training.
Virtual Reality. Pub Date: 2024-07-18. DOI: 10.1007/s10055-024-01038-4
Dominik Spinczyk, Grzegorz Rosiak, Krzysztof Milczarek, Dariusz Konecki, Jarosław Żyłkowski, Jakub Franke, Maciej Pech, Karl Rohmer, Karol Zaczkowski, Ania Wolińska-Sołtys, Piotr Sperka, Dawid Hajda, Ewa Piętka
Title: Towards overcoming barriers to the clinical deployment of mixed reality image-guided navigation systems supporting percutaneous ablation of liver focal lesions
Abstract: In recent years, minimally invasive procedures for treating liver tumours have risen in popularity, among them percutaneous thermoablation conducted using image-guided navigation systems with mixed reality technology. However, applying this method requires adequate training in the system employed. In our study, we assessed which skills pose the greatest challenges in performing such procedures. The article proposes a training module characterized by an innovative approach: the possibility of practising the diagnosis, planning, and execution stages, including physically performing the execution stage on a radiological phantom of the abdominal cavity. The proposed approach was evaluated with a set of four exercises corresponding to these three phases. The research group included 10 radiologists and 5 residents. Based on 20 clinical cases of liver tumors subjected to percutaneous thermoablation, we developed assessment tasks evaluating four skill categories: head-mounted display (HMD) use, ultrasound (US)/computed tomography (CT) image fusion interpretation, tracking system use, and the ability to insert a needle. Results were reported on a Likert scale. The results indicate that the most challenging aspect for radiology specialists is adapting to HMD gesture control, while residents point to intraoperative fusion images and respiratory movements of the liver as the most problematic. To support transfer of the skills to new patients, the module also allows a new hologram to be created for a different clinical case.
Virtual Reality. Pub Date: 2024-07-10. DOI: 10.1007/s10055-024-01031-x
Aleksandra Zheleva, Lieven De Marez, Durk Talsma, Klaas Bombeke
Title: Intersecting realms: a cross-disciplinary examination of VR quality of experience research
Abstract: The advent of virtual reality (VR) technology has necessitated a reevaluation of quality of experience (QoE) models. While numerous recent efforts have been dedicated to creating comprehensive QoE frameworks, the factors studied as potential influencers of QoE are often limited to single disciplinary viewpoints or specific user-related aspects. Furthermore, most literature reviews in this domain have focused predominantly on academic sources, overlooking industry insights. To address these points, the current research took an interdisciplinary literature review approach, examining QoE literature from both academic and industry sources across diverse fields (psychology, ergonomics, user experience, communication science, and engineering). Based on this rich dataset, we created a QoE model that illustrates 252 factors grouped into four branches: user, system, context, and content. The main finding of this review is a substantial gap in the current research landscape: complex interactions among user, system, context, and content factors in VR are overlooked. The current research not only identifies this crucial disparity in existing QoE studies but also provides a substantial online repository of over 200 QoE-related factors. The repository serves as a valuable tool for future researchers aiming to construct a more holistic understanding of QoE.
Virtual Reality. Pub Date: 2024-07-09. DOI: 10.1007/s10055-024-01032-w
Ashlee Gronowski, David Caelum Arness, Jing Ng, Zhonglin Qu, Chng Wei Lau, Daniel Catchpoole, Quang Vinh Nguyen
Title: The impact of virtual and augmented reality on presence, user experience and performance of Information Visualisation
Abstract: The fast growth of virtual reality (VR) and augmented reality (AR) head-mounted displays provides a new medium for interactive visualisations and visual analytics. Presence is the experience of consciousness within extended reality, and it has the potential to increase task performance. This project studies the impact that a sense of presence has on data visualisation performance and user experience under AR and VR conditions. A within-subjects study recruited 38 participants to complete interactive visualisation tasks in a novel immersive data analytics system for genomic data in AR and VR, measuring speed, accuracy, preference, presence, and user satisfaction. Open-ended user experience responses were also collected. The results suggest that VR was more conducive to efficiency, effectiveness, and user experience, and offer insight into possible cognitive-load benefits for VR users.
Virtual Reality. Pub Date: 2024-07-09. DOI: 10.1007/s10055-024-01030-y
Xiaotian Zhang, Weiping He, Mark Billinghurst, Yunfei Qin, Lingxiao Yang, Daisong Liu, Zenglei Wang
Title: Usability of visualizing position and orientation deviations for manual precise manipulation of objects in augmented reality
Abstract: Manual precise manipulation of objects is an essential skill in everyday life, and Augmented Reality (AR) is increasingly being used to support such operations. In this study, we investigate whether detailed visualizations of position and orientation deviations help AR-assisted manual precise manipulation of objects. We developed three AR instructions with different visualizations of deviations: a baseline instruction showing logical deviations, an instruction based on precise numerical deviations, and an instruction based on intuitive color-mapped deviations. All three instructions visualized the required directions for manipulation and logical values indicating whether the object met the accuracy requirements; the latter two additionally visualized the deviations in detail, through numerical text and color mapping respectively. A user study was conducted with 18 participants to compare the three AR instructions. The results showed no significant differences in speed, accuracy, perceived ease of use, or perceived workload between the three instructions. We found that visualizing the required directions for manipulation and whether the object met the accuracy requirements was sufficient to guide manual precise manipulation. Detailed visualizations of real-time deviations did not improve the speed or accuracy of manipulation, and although they improved perceived ease of use and user experience, the effects were not significant. Based on the results, several recommendations are provided for designing AR instructions to support precise manual manipulation.
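As a sketch of what a color-mapped deviation visualization might compute, the following maps a position deviation to a green-to-red color ramp. This is illustrative only; the tolerance and maximum thresholds are assumed values, not taken from the study.

```python
# Hedged sketch of a color-mapped deviation cue: blend from green (within
# tolerance) toward red (at or above a maximum). Thresholds are assumptions.

def deviation_color(dev_mm, tol_mm=0.5, max_mm=5.0):
    """Return an (r, g, b) tuple on a linear green-to-red ramp,
    clamped below the tolerance and above the maximum deviation."""
    if dev_mm <= tol_mm:
        return (0.0, 1.0, 0.0)       # fully green: accuracy requirement met
    t = min((dev_mm - tol_mm) / (max_mm - tol_mm), 1.0)
    return (t, 1.0 - t, 0.0)         # blend toward red as deviation grows

print(deviation_color(0.2))   # (0.0, 1.0, 0.0)
print(deviation_color(5.0))   # (1.0, 0.0, 0.0)
```

A binary green/red cue at the tolerance threshold would correspond to the study's "logical value" baseline; the continuous ramp corresponds to the detailed color-mapped condition.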