An automated CAD-to-XR framework based on generative AI and Shrinkwrap modelling for a User-Centred design approach

Riccardo Rosati, Paolo Senesi, Barbara Lonzi, Adriano Mancini, Marco Mandolini

Advanced Engineering Informatics, Volume 62, Article 102848. Published 2024-09-27. DOI: 10.1016/j.aei.2024.102848
Citations: 0
Abstract
CAD-to-XR is the workflow for generating interactive Photorealistic Virtual Prototypes (iPVPs) for Extended Reality (XR) applications from Computer-Aided Design (CAD) models. The process entails modelling, texturing, and XR programming. In the literature, no automatic CAD-to-XR framework simultaneously manages CAD simplification and texturing, and there are no examples of such frameworks being adopted for User-Centred Design (UCD). Moreover, existing CAD-to-XR workflows do not exploit the potential of generative algorithms to produce synthetic images (textures). This paper presents a framework for implementing the CAD-to-XR workflow. The solution comprises a texture-generation module based on Generative Adversarial Networks (GANs). The generated texture is then handled by a second module (based on Shrinkwrap modelling), which builds the iPVP by simplifying the 3D model and UV-mapping the generated texture onto it. The geometric and material data are integrated into a graphics engine, which allows an interactive experience with the iPVP to be programmed in XR. The CAD-to-XR framework was validated on two components (stock and forend) of a sporting rifle. The solution automates the texturing process for different product versions in less time than a manual procedure, and after each product revision it avoids the tedious manual activities otherwise required to generate a new iPVP. The image quality metrics indicate that textures are generated realistically: the perceived quality of the generated textures is highly comparable to that of real images. The quality of the iPVPs generated through the proposed framework and visualised by users through a mixed-reality head-mounted display is equivalent to that of traditionally designed prototypes.
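The abstract describes a three-stage pipeline: Shrinkwrap-based CAD simplification, GAN-based texture synthesis, and UV mapping/export to a graphics engine for XR interaction. The following is a minimal sketch of how such a pipeline could be orchestrated, assuming a DCGAN-style generator for the texture stage. Every name here (TextureGenerator, shrinkwrap_simplify, apply_uv_mapping, export_for_xr, the file paths) is an illustrative assumption, not the authors' implementation or API; the CAD and UV steps are stubs standing in for the actual modelling tools.

```python
# Hedged sketch of a CAD-to-XR pipeline: simplify CAD, synthesise a texture with a
# GAN, UV-map it, and export the iPVP for a graphics engine. Placeholder names only.
import torch
import torch.nn as nn


class TextureGenerator(nn.Module):
    """DCGAN-style generator mapping a latent vector to a 64x64 RGB texture."""

    def __init__(self, latent_dim: int = 128):
        super().__init__()
        self.latent_dim = latent_dim
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(),  # 1x1 -> 4x4
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(),          # 4x4 -> 8x8
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(),            # 8x8 -> 16x16
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(),             # 16x16 -> 32x32
            nn.ConvTranspose2d(32, 3, 4, 2, 1), nn.Tanh(),                                  # 32x32 -> 64x64
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)


def shrinkwrap_simplify(cad_path: str) -> dict:
    """Placeholder for the Shrinkwrap simplification stage (CAD model -> lightweight mesh)."""
    return {"source": cad_path, "mesh": "simplified-mesh"}


def apply_uv_mapping(mesh: dict, texture: torch.Tensor) -> dict:
    """Placeholder for UV-mapping the generated texture onto the simplified mesh."""
    return {**mesh, "texture": texture}


def export_for_xr(ipvp: dict, out_path: str) -> None:
    """Placeholder for handing the iPVP to a graphics engine (e.g. as a glTF asset)."""
    print(f"Exporting iPVP with texture of shape {tuple(ipvp['texture'].shape)} to {out_path}")


def cad_to_xr(cad_path: str, out_path: str) -> None:
    """End-to-end pipeline: simplify CAD, synthesise texture, UV-map, export for XR."""
    generator = TextureGenerator()
    mesh = shrinkwrap_simplify(cad_path)
    z = torch.randn(1, generator.latent_dim, 1, 1)  # latent code for one texture sample
    texture = generator(z)                          # synthetic texture, shape (1, 3, 64, 64)
    ipvp = apply_uv_mapping(mesh, texture)
    export_for_xr(ipvp, out_path)


if __name__ == "__main__":
    cad_to_xr("rifle_stock.step", "rifle_stock_ipvp.gltf")  # hypothetical file names
```

In practice the generator would be trained on photographs of real material finishes, and the simplification and UV-mapping stubs would be replaced by the CAD/DCC tooling the paper builds on; the sketch only illustrates how the stages chain together to remove manual texturing work after each product revision.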
Journal overview:
Advanced Engineering Informatics is an international Journal that solicits research papers with an emphasis on 'knowledge' and 'engineering applications'. The Journal seeks original papers that report progress in applying methods of engineering informatics. These papers should have engineering relevance and help provide a scientific base for more reliable, spontaneous, and creative engineering decision-making. Additionally, papers should demonstrate the science of supporting knowledge-intensive engineering tasks and validate the generality, power, and scalability of new methods through rigorous evaluation, preferably both qualitatively and quantitatively. Abstracting and indexing for Advanced Engineering Informatics include Science Citation Index Expanded, Scopus and INSPEC.