{"title":"Recent Advances in Computer Vision and Machine Learning for Athletic Performance in Jump Events","authors":"Arya Shah, Darshan Prajapati","doi":"10.1007/s41133-025-00087-x","DOIUrl":"10.1007/s41133-025-00087-x","url":null,"abstract":"<div><p>The application of machine learning and computer vision in athletics facilitates analytical tools for the triple jump and long jump, allowing for the monitoring of speed, posture, and the phases of take-off and landing. Prior research has focused on sports biomechanics; however, this work integrates computer vision and machine learning techniques for triple and long jump events. This study examines how sophisticated technology may transform conventional analytical methods concerning precision and efficiency in evaluating athletes' techniques. The investigation indicates that neural networks, RNNs, and CNNs surpassed traditional methods. This study examines the issues associated with advanced algorithms, including accuracy degradation with varying jumps, expenses related to video capture systems, and the ethical implications of contemporary technology. This work investigates computer vision and machine learning to improve athlete performance via comprehensive feedback, encompassing data acquisition through wearables and computer vision systems. This research facilitates the creation of prediction models for performance analysis and addresses dataset restrictions using machine learning approaches, including transfer learning. It examines AI-driven feedback mechanisms to enhance training efficiency. The research demonstrated that approach velocity directly affects leap distance. The categorization and forecasting of leaps using these techniques help coaches assess skills and adjust training regimens.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145256700","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mixed Reality for Human–Robot Teaming to Enhance Work Health and Safety in Manufacturing Industries: A Systematic Literature Review","authors":"Apurba Das, Azizur Rahman, Syed Tanvin Hossain, Rubaiat Ahmed, Mahmim Ara","doi":"10.1007/s41133-025-00085-z","DOIUrl":"10.1007/s41133-025-00085-z","url":null,"abstract":"<div><p>Mixed reality (MR) integrated with human–robot teaming (HRT) has emerged as a promising approach to address persistent challenges in work health and safety (WHS) within manufacturing. To evaluate its potential, we conducted a systematic review of 33 peer-reviewed studies published between 2015 and 2024, identified from databases indexed in Google Scholar (e.g., IEEE, Elsevier, Springer, MDPI). Studies were screened using predefined inclusion and exclusion criteria, and quality was appraised with the Mixed-Methods Appraisal Tool (MMAT). The synthesis highlights three major applications of MR in HRT for WHS: immersive training and ergonomic assessment, real-time hazard monitoring and visualization, and enhanced human–robot communication via intuitive interfaces and natural language processing. Reported benefits include faster skill acquisition, improved situational awareness, and reduced accident risks. However, key barriers remain—particularly cognitive overload, ergonomic discomfort, integration with legacy manufacturing systems, and limited longitudinal evidence. Despite these challenges, the review demonstrates that MR–HRT solutions can significantly strengthen WHS outcomes if designed with ergonomic validation, adaptive feedback mechanisms, and scalable deployment strategies. For manufacturing industries, the findings provide a practical roadmap: prioritize user-centered MR design, invest in real-world pilot implementations, and embed WHS outcomes into technology evaluation. Advancing MR–HRT beyond proof-of-concept will require interdisciplinary collaboration and rigorous validation, enabling safer, smarter, and more resilient manufacturing environments.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s41133-025-00085-z.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145210610","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Surgeons Experience with Head-Mounted Augmented Reality for Intra-articular Fractures in Orthopedic Trauma Surgery","authors":"Lucy Knöps, Alexander M. Wakker, Elise Lie, Bart Cornelissen, Abdullah Thabit, Mohamed Benmahdjoub, Theo van Walsum, Michael H. J. Verhofstad, Esther M. M. van Lieshout, Mark G. van Vledder","doi":"10.1007/s41133-025-00084-0","DOIUrl":"10.1007/s41133-025-00084-0","url":null,"abstract":"<div><p>Conventional 2D imaging in orthopedic trauma surgery lacks depth and requires attention shifts away from the operative field. Head-mounted augmented reality (AR HMDs) may improve intra-operative visualization by overlaying 3D holograms in the field of view. However, clinical evaluations focusing on surgeon experience remain limited. This study aimed to evaluate the usability and surgeon experience with AR HMD during intra-articular fracture surgery. A prospective single-center case series was conducted with ten orthopedic trauma surgeons who each completed a preclinical simulator session and then used a Microsoft HoloLens 2 to visualize patient-specific 3D models during 20 open reduction and internal fixation procedures. Outcomes: Simulator Sickness Questionnaire (SSQ; primary), Borg CR10 physical exertion, NASA-TLX mental workload, System Usability Scale (SUS), and a feasibility questionnaire. Across 20 procedures, SSQ indicated symptoms ranging from minimal to significant (preclinical mean 12.7, SD 16.2; intra-operative/postoperative mean 22.0, SD 20.7). Physical exertion was very low (Borg CR10 median 1.0, <i>P</i><sub>25</sub>–<i>P</i><sub>75</sub> 0–1). Mental demand was medium (NASA-TLX mean 23.0, SD 21.9). Usability was rated good (SUS mean 69.3, SD 14.0). Surgeons judged potential utility highest for complex trauma, revision cases, and osteotomies (feasibility means 73.0, 73.0, and 68.0, respectively). Overall satisfaction averaged 62.0 (SD 27.5), and willingness to reuse was high (median 80, <i>P</i><sub>25</sub>–<i>P</i><sub>75</sub> 55–87). Common challenges were gesture control and hologram positioning. Intra-operative AR use was feasible, with low physical exertion, medium mental demand, and good perceived usability, although simulator sickness symptoms were reported. Surgeons expressed willingness to reuse the system and identified greatest value in complex articular reconstructions. Findings reflect a small, single-center prospective case study using one AR HMD model and did not assess patient outcomes. Results support further multicenter evaluations focused on ergonomics, interaction design, workflow integration, and clinical impact.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s41133-025-00084-0.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145210550","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Integration of Virtual Reality (VR), Augmented Reality (AR), and Artificial Intelligence in Revolutionizing Healthcare: A Systematic Review","authors":"Declan Ikechukwu Emegano, Dilber Uzun Ozsahin, Berna Uzun, Ilker Ozsahin","doi":"10.1007/s41133-025-00082-2","DOIUrl":"10.1007/s41133-025-00082-2","url":null,"abstract":"<div><p>Healthcare is experiencing rapid advancements due to the integration of virtual reality (VR), a computer-generated simulation that uses technology to generate an artificial environment; augmented reality (AR), a technology that augments the physical environment by superimposing digital content onto the actual world; and artificial intelligence (AI), which enables personalized diagnostics, immersive training, and improved patient care. This comprehensive review identified 1,075 records by conducting a search of Scopus, PubMed, and Science Direct for studies published from 2019 to 2024. A total of 37 research studies were evaluated following a thorough screening process that included the application of eligibility and exclusion criteria and the removal of duplicated studies following PRISMA regulations. The main findings indicate a significant increase in the overall number of publications, with the USA and the UK accounting for 51.3% of all publications because of their robust research machinery. Countries like Korea, Turkey, Australia, and Italy represented an overall 5.4 to 10.8%. The result of this review could be applied predominantly in telemedicine, educational institutions, rehabilitation, and surgical procedures, resulting in enhanced interaction between patients and operational precision. In summary, although VR, AR, and AI improve health-related education, therapy, and care, they need to tackle limitations such as expenditures and limitations in technology to achieve widespread adoption.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145037340","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Computer Vision-Based Archery Optics","authors":"Atul Raj","doi":"10.1007/s41133-025-00083-1","DOIUrl":"10.1007/s41133-025-00083-1","url":null,"abstract":"<div><p>Today, archery is used in sports, hunting, recreational shooting, movies, etc. Traditional sights can lose alignment due to vibration and require tools for zeroing, are time-consuming for adjustment, difficult to see in certain backgrounds, and are affected by parallax. To overcome these challenges, a computer vision-based aiming application for smartphones was developed. The features of this application are digital aiming, zeroing, and arrow drop zeroing without any tools, background-based reticle color inversion, sensor-based incline level indicator, zoom, and zeroed distance autosave. These features aim to improve visibility, ease of adjustment, and aiming without any additional cost. Next, the performance of the sight system was tested by an archer firing arrows at a 50 cm target. A total number of 37 shots were fired outside on 3 days, early morning. By using the new sight, a mean absolute error of 10.85 on day 1, 7.18 on day 2, and 6.25 on day 3 was obtained. The study was limited by a small sample size due to difficulty in finding another skilled archer, as archery is not a common sport and has a huge learning curve. The current study identifies the practicality and efficiency of computer vision-based augmentation, like digital aiming, fast zeroing, and better visibility. Additionally, in future, other studies can work on the use of AI, ML, and sensor-based wind direction prediction in a smartphone application.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144998388","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Taste Augmentation of Wine by Artificial Climate Room: Influence of Temperature and Humidity on Taste Evaluation","authors":"Toshiharu Igarashi, Yoichi Ochiai","doi":"10.1007/s41133-025-00081-3","DOIUrl":"10.1007/s41133-025-00081-3","url":null,"abstract":"<div><p>This study investigates the effects of temperature and humidity on the subjective characteristics of wine through evaluations in two distinct environments: an artificial climate chamber and a conference room. Two wines, wine 1(CROIX DE BEAUCAILLOU 2011) and wine 2(BLAGNY 1ER CRU LA PIECE SOUS LE BOIS 2014), were analyzed. Significant differences in color intensity, aging degree, and body were observed for wine 1 in the artificial climate room, and in flavor intensity in the conference room. For wine 2, significant differences were detected in flavor intensity and aging degree in the artificial climate room. Additionally, the composition changes of the wines concerning temperature were examined, revealing correlations between specific acids and temperature changes. These findings indicate that wine taste can be optimized by adjusting environmental conditions based on wine type and personal preferences, suggesting the potential for climate-controlled environments in enhancing wine and food experiences in restaurants.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s41133-025-00081-3.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143373343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptic Gamer Suit for Enhancing VR Games Experience","authors":"Sathonkorn Saladtook, Phumiphat Rujirotthamrong, Cherapa Eiwaroon, Jirutchaya Phunpar, Chattaporn Saladtook, Chutisant Kerdvibulvech","doi":"10.1007/s41133-024-00079-3","DOIUrl":"10.1007/s41133-024-00079-3","url":null,"abstract":"<div><p>In recent times, virtual reality (VR) games have gained immense popularity, captivating gamers worldwide. While VR games offer an immersive experience, they do have certain limitations, particularly when it comes to the utilization of hand controllers. This research endeavors to overcome these limitations and provide players with an enhanced and realistic gaming experience by introducing a novel accessory called the Haptic Gamer Suit (H-Suit), designed to complement VR headsets. This paper presents the development of the H-Suit, achieved through an exploration of haptic technologies and conducting in-depth interviews with gaming experts. The H-Suit is a comprehensive outfit comprising five components: a shirt, pants, belt, gloves, and socks, all seamlessly integrated with the VR headset. Embedded within the suit are sensors and circuit boards, meticulously engineered to simulate the sensations experienced by game characters when they encounter damage. By employing the H-Suit, users can engage in VR games without the need for handheld controllers, enabling a heightened level of realism throughout their gaming sessions. Through extensive research in haptic technologies and insights gained from expert gamers, the H-Suit has been conceptualized and brought to fruition. Its seamless integration with the VR headset ensures effortless connectivity, and the incorporation of sensors and circuit boards in the suit delivers a tangible and authentic gaming experience. As a result, players can engage in VR games without the constraints of holding controllers, thereby immersing themselves in a world that feels remarkably lifelike, courtesy of the H-Suit.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142645406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Retraction Note: Application on Virtual Reality for Enhanced Education Learning, Military Training and Sports","authors":"Kunjal Ahir, Kajal Govani, Rutvik Gajera, Manan Shah","doi":"10.1007/s41133-024-00076-6","DOIUrl":"10.1007/s41133-024-00076-6","url":null,"abstract":"","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142636858","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Impact of Transferring Embodiment and Work Efficiency Between Natural Body and Modular Body Systems","authors":"Vitvasin Vimolmongkolporn, Yukiko Iwasaki, Fumihiro Kato, Hiroyasu Iwata","doi":"10.1007/s41133-024-00078-4","DOIUrl":"10.1007/s41133-024-00078-4","url":null,"abstract":"<div><p>Human augmentation technology, particularly supernumerary robotic limbs, has seen rapid growth and offers promising applications. However, the cognitive aspects of supernumerary robotic limbs, such as the sense of embodiment, remain underexplored in the context of modular body systems that which provide the part of one’s own body feeling, especially when detached, and how this might impact work efficiency. This study aims to investigate the impact of experience of synchronizing Sense of Embodiment and work efficiency between the user’s innate body and a modular body system. The experiment was conducted using a modular body prototype and compared between with wearing experience (the user is more likely to perceive this robot as a part of their body) and without wearing experience (the user is more likely to perceive this robot as a separate, standalone robot). Objective evaluations included task completion time and accuracy, while the sense of embodiment questionnaire was employed for subjective evaluations. The results suggested that after having experience of wearing modular arm for a while, this experience can influence the work efficiency even if it was used without wearing it.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s41133-024-00078-4.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142636903","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Smart Life Saver Jacket: A New Jacket to Support CPR Operation","authors":"Thayita Chantarutai, Piyachat Klinthai, Pimpakarn A-masiri, Chutisant Kerdvibulvech","doi":"10.1007/s41133-024-00080-w","DOIUrl":"10.1007/s41133-024-00080-w","url":null,"abstract":"<div><p>Cardiac arrest is common death these days. Most patients do not notice the symptoms before it happens. Death or severe consequence and be prevented if help and proper assistance can be reached within time. Since a cardiac arrest is prone to get higher continuously, some buildings have installed AED (automated external defibrillator) defibrillators. Therefore, patients can reach for help immediately. But the fact is that some of the helpers can reach a patient within time but they are not sure how to operate heart stimulation by AED or CPR (cardiopulmonary resuscitation) properly nor do they make a decision to push the patient’s chest. In Thailand, CPR training is just an option; it is not a compulsory lesson. Nevertheless, CPR trainees can obtain only the theoretical lesson; then, they do not know exactly how hard to push the patient’s chest. This leads to misoperation when they face the real incident. Another factor is when an incident occurs, helpers do not know how to contact emergency and do not know what important information they need to provide to medical support. In this paper, we develop a smart life saver jacket to support the helper in how perform accurate CPR by using machine learning technology to detect the patient’s pulse and support the helper to make decisions combined with an interface idea to indicate helper how to perform accurate CPR, while it is the application that will connect to the nearest hospital.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"10 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142600668","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}