{"title":"增强现实技术和跨设备互动技术实现物理和数字科学论文的无缝整合","authors":"Md Ochiuddin Miah, Jun Kong","doi":"10.1101/2024.02.05.578116","DOIUrl":null,"url":null,"abstract":"Researchers face the challenge of efficiently navigating vast scientific literature while valuing printed papers in the digital age. Printed materials facilitate deeper engagement and comprehension, leading to superior exam performance and enhanced retention. However, existing digital tools often need to pay more attention to the needs of researchers who value the tactile benefits of printed documents. In response to this gap, we introduce AR-PaperSync, a transformative solution that leverages Augmented Reality (AR) and cross-device interaction technology. AR-PaperSync seamlessly integrates the physical experience of printed papers with the interactive capabilities of digital tools. Researchers can effortlessly navigate inline citations, manage saved references, and synchronize reading notes across mobile, desktop, and printed paper formats. Our user-centric approach, informed by in-depth interviews with six researchers, ensures that AR-PaperSync is tailored to its target users' needs. A comprehensive user study involving 28 participants evaluated AR-PaperSync's significantly improved efficiency, accuracy, and cognitive load in academic reading tasks compared to conventional methods. These findings suggest that AR-PaperSync enhances the reading experience of printed scientific papers and provides a seamless integration of physical and digital reading environments for researchers.","PeriodicalId":501568,"journal":{"name":"bioRxiv - Scientific Communication and Education","volume":"18 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Augmented Reality and Cross-Device Interaction for Seamless Integration of Physical and Digital Scientific Papers\",\"authors\":\"Md Ochiuddin Miah, Jun Kong\",\"doi\":\"10.1101/2024.02.05.578116\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Researchers face the challenge of efficiently navigating vast scientific literature while valuing printed papers in the digital age. Printed materials facilitate deeper engagement and comprehension, leading to superior exam performance and enhanced retention. However, existing digital tools often need to pay more attention to the needs of researchers who value the tactile benefits of printed documents. In response to this gap, we introduce AR-PaperSync, a transformative solution that leverages Augmented Reality (AR) and cross-device interaction technology. AR-PaperSync seamlessly integrates the physical experience of printed papers with the interactive capabilities of digital tools. Researchers can effortlessly navigate inline citations, manage saved references, and synchronize reading notes across mobile, desktop, and printed paper formats. Our user-centric approach, informed by in-depth interviews with six researchers, ensures that AR-PaperSync is tailored to its target users' needs. A comprehensive user study involving 28 participants evaluated AR-PaperSync's significantly improved efficiency, accuracy, and cognitive load in academic reading tasks compared to conventional methods. 
These findings suggest that AR-PaperSync enhances the reading experience of printed scientific papers and provides a seamless integration of physical and digital reading environments for researchers.\",\"PeriodicalId\":501568,\"journal\":{\"name\":\"bioRxiv - Scientific Communication and Education\",\"volume\":\"18 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-02-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"bioRxiv - Scientific Communication and Education\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1101/2024.02.05.578116\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"bioRxiv - Scientific Communication and Education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1101/2024.02.05.578116","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Augmented Reality and Cross-Device Interaction for Seamless Integration of Physical and Digital Scientific Papers
Researchers face the challenge of efficiently navigating vast scientific literature while continuing to value printed papers in the digital age. Printed materials facilitate deeper engagement and comprehension, leading to superior exam performance and enhanced retention. However, existing digital tools often overlook the needs of researchers who value the tactile benefits of printed documents. In response to this gap, we introduce AR-PaperSync, a transformative solution that leverages Augmented Reality (AR) and cross-device interaction technology. AR-PaperSync seamlessly integrates the physical experience of printed papers with the interactive capabilities of digital tools. Researchers can effortlessly navigate inline citations, manage saved references, and synchronize reading notes across mobile, desktop, and printed paper formats. Our user-centric approach, informed by in-depth interviews with six researchers, ensures that AR-PaperSync is tailored to its target users' needs. A comprehensive user study with 28 participants found that AR-PaperSync significantly improved efficiency and accuracy and reduced cognitive load in academic reading tasks compared to conventional methods. These findings suggest that AR-PaperSync enhances the reading experience of printed scientific papers and provides a seamless integration of physical and digital reading environments for researchers.
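The abstract mentions synchronizing reading notes across mobile, desktop, and printed paper formats. The paper itself does not publish a data model, so the following is only a minimal illustrative sketch of what such cross-device note synchronization could look like; every name here (SyncedNote, mergeNotes, the anchor fields) is an assumption, not the authors' implementation.

```typescript
// Hypothetical sketch: a note anchored to a location on the printed page
// (page number plus normalized coordinates, e.g. recovered via AR tracking),
// carrying a timestamp so copies from different devices can be merged
// with a simple last-writer-wins rule.

interface SyncedNote {
  id: string;                      // stable identifier shared across devices
  paperDoi: string;                // which paper the note belongs to
  page: number;                    // printed page the note is anchored to
  x: number;                       // normalized horizontal anchor (0..1)
  y: number;                       // normalized vertical anchor (0..1)
  text: string;                    // the note content
  updatedAt: number;               // Unix timestamp of the last edit
  device: "mobile" | "desktop";
}

// Merge two device-local note sets: for notes sharing an id,
// keep the most recently updated copy.
function mergeNotes(a: SyncedNote[], b: SyncedNote[]): SyncedNote[] {
  const byId = new Map<string, SyncedNote>();
  for (const note of [...a, ...b]) {
    const existing = byId.get(note.id);
    if (!existing || note.updatedAt > existing.updatedAt) {
      byId.set(note.id, note);
    }
  }
  return [...byId.values()];
}

// Example: the same note edited on two devices resolves to the newer edit.
const mobile: SyncedNote[] = [{
  id: "n1", paperDoi: "10.1101/2024.02.05.578116", page: 3,
  x: 0.4, y: 0.7, text: "check cited dataset", updatedAt: 1707400000, device: "mobile",
}];
const desktop: SyncedNote[] = [{
  id: "n1", paperDoi: "10.1101/2024.02.05.578116", page: 3,
  x: 0.4, y: 0.7, text: "check cited dataset (done)", updatedAt: 1707400500, device: "desktop",
}];
console.log(mergeNotes(mobile, desktop)); // keeps the newer desktop edit
```

This last-writer-wins merge is only one plausible design choice; the actual system may resolve conflicts differently.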