Remaining useful lifetime prediction for milling blades using a fused data prediction model (FDPM)
Teemu Mäkiaho, Jouko Laitinen, Mikael Nuutila, Kari T. Koskinen
Journal of Intelligent Manufacturing, published 2024-05-11. DOI: 10.1007/s10845-024-02398-z

In various industry sectors, predicting the real-life availability of milling applications poses a significant challenge. This challenge arises from the need to prevent inefficient blade resource utilization and the risk of machine breakdowns due to natural wear. To ensure timely and accurate adjustments to milling processes based on the machine's cutting blade condition without disrupting ongoing production, we introduce the Fused Data Prediction Model (FDPM), a novel temporal hybrid prediction model. The FDPM combines the static and dynamic features of the machines to generate simulated outputs, including average cutting force, material removal rate, and peripheral milling machine torque. These outputs are correlated with real blade wear measurements, creating a simulation model that provides insight into wear progression when associated with real machine operational parameters. The FDPM also incorporates data preprocessing, reducing the dimensional space of the inputs before an advanced recurrent neural network forecasts blade wear levels in milling. Validation of the physics-based simulation model indicates the highest fidelity in replicating wear progression with the average cutting force variable, with an average relative error of 2.38% compared to the measured mean rake wear during the milling cycle. These findings illustrate the effectiveness of the FDPM approach, which achieves a prediction accuracy exceeding 93% when the model is trained with only 50% of the available data. These results highlight the potential of the FDPM as a robust and versatile method for precisely assessing wear levels in milling operations without disrupting ongoing production.

A Generative AI approach to improve in-situ vision tool wear monitoring with scarce data
Alberto Garcia-Perez, Maria Jose Gomez-Silva, Arturo de la Escalera-Hueso
Journal of Intelligent Manufacturing, published 2024-05-10. DOI: 10.1007/s10845-024-02379-2

Most aerospace turbine casings are mechanised using a vertical lathe. This paper presents a tool wear monitoring system based on computer vision that analyses tool inserts once the machining process has been completed. By installing a camera in the robot magazine room and a tool cleaning device to remove chips and cooling residuals, a neat tool image can be acquired. A subsequent Deep Learning (DL) model classifies the tool as acceptable or not, avoiding the drawbacks of alternative computer vision algorithms based on edges, dedicated features, etc. The model was trained with a significantly reduced number of images in order to minimise the costly process of acquiring and classifying images during production. This was achieved by introducing a special lens and generative Artificial Intelligence (AI) models. This paper proposes two novel architectures to artificially increase the number of images for training the DL model: SCWGAN-GP, a Scalable Condition Wasserstein Generative Adversarial Network (WGAN) with Gradient Penalty, and a Focal Stable Diffusion (FSD) model with class injection and a dedicated loss function. In addition, K|Lens special optics were used to obtain multiple views of the vertical lathe inserts, further increasing data augmentation in hardware with a reduced number of production samples. Given an initial dataset, the classification accuracy was increased from 80.0% to 96.0% using the FSD model. Using only 100 original images for each insert type and wear condition, the methodology achieves 93.3% accuracy, rising to 94.6% with 200 images. This accuracy is considered to be within human inspector uncertainty for this use case. Fine-tuning the FSD model, with nearly 1 billion training parameters, showed superior performance compared to the SCWGAN-GP model, with only 80 million parameters, while also requiring fewer training samples to produce higher-quality output images. Furthermore, visualization of the output activation mapping confirms that the model bases its decisions on the right image features. The time to create the dataset was reduced from 3 months to 2 days using generative AI, so our approach enables the creation of industrial datasets with minimal effort and a significant speed-up compared with the conventional approach of acquiring the large number of images that DL models usually require to avoid over-fitting. Despite the good results, this methodology is only applicable to relatively simple cases, such as our inserts, where the images are not complex.
{"title":"Lightweight convolutional neural network for fast visual perception of storage location status in stereo warehouse","authors":"Liangrui Zhang, Xi Zhang, Mingzhou Liu","doi":"10.1007/s10845-024-02397-0","DOIUrl":"https://doi.org/10.1007/s10845-024-02397-0","url":null,"abstract":"<p>Accurate storage location status data is an important input for location assignment in the inbound stage. Traditional Internet of Things (IoT) identification technologies require high costs and are easily affected by warehouse environments. A lightweight convolutional neural network is proposed for perceiving storage status to achieve high stability and low cost of location availability monitoring. Based on the existing You Only Look Once (YOLOv5) algorithm, the Hough transform is used in the pre-processing to implement tilt correction on the image to improve the stability of object localization. Then the feature extraction unit CBlock is designed based on a new depthwise separable convolution in which the convolutional block attention module is embedded, focusing on both channel and spatial information. The backbone network is constructed by stacking these CBlock blocks to compress the computational cost. The improved neck network adds cross-layer information fusion to reduce the information loss caused by sampling and ensure perceptual accuracy. Moreover, the penalty metric is redefined by SIoU, which considers the vector angle of the bounding box regression and improves the convergence speed and accuracy. The experiments show that the proposed model achieves successful results for storage location status perception in stereo warehouse.</p>","PeriodicalId":16193,"journal":{"name":"Journal of Intelligent Manufacturing","volume":"22 1","pages":""},"PeriodicalIF":8.3,"publicationDate":"2024-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140928607","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Warpage detection in 3D printing of polymer parts: a deep learning approach","authors":"Vivek V. Bhandarkar, Ashish Kumar, Puneet Tandon","doi":"10.1007/s10845-024-02414-2","DOIUrl":"https://doi.org/10.1007/s10845-024-02414-2","url":null,"abstract":"<p>While extrusion-based Additive Manufacturing (AM) facilitates the production of intricately shaped parts especially for polymer processing with customized geometries, the process’s diverse parameters often lead to various defects that significantly impact the quality and hence the mechanical properties of the manufactured parts. One prominent defect in polymer-based AM is warping, which can significantly compromise the quality of 3D-printed parts. In this work, a deep learning (DL) approach based on convolutional neural networks (CNN) was developed to automatically detect warpage defects in 3D-printed parts, subsequently leading to quality control of the 3D-printed parts. Experiments were conducted using a customized Delta 3D printer with acrylonitrile butadiene styrene (ABS) and polylactic acid (PLA) materials, following the ASTM D638 tensile specimen geometry and employing design of experiments (DoE) methodology. The CNN dataset was generated by autonomously capturing high-quality (HQ) images at regular intervals using a Raspberry Pi (RPi) setup, storing the timestamped images on Google Drive, and categorizing them into ‘warped’ and ‘unwarped’ classes based on user-defined criteria. The novelty of this research lies in creating a setup for gathering image-based datasets and deploying a DL-based CNN for the real-time identification of warpage defects in 3D printed parts made of ABS and PLA materials, achieving an outstanding accuracy rate of 99.43%. This research furnishes engineers and manufacturers with a step to bolster quality control in polymer-based AM, offering automated defect correction through feedback control. By enhancing the reliability and efficiency of AM processes, it empowers practitioners to achieve higher standards of production.</p>","PeriodicalId":16193,"journal":{"name":"Journal of Intelligent Manufacturing","volume":"19 1","pages":""},"PeriodicalIF":8.3,"publicationDate":"2024-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140928618","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Variability-enhanced knowledge-based engineering (VEN) for reconfigurable molds
Zeeshan Qaiser, Kunlin Yang, Rui Chen, Shane Johnson
Journal of Intelligent Manufacturing, published 2024-05-08. DOI: 10.1007/s10845-024-02361-y

Mass production of high geometric variability surfaces, particularly in customized medical or ergonomic systems, is difficult because such surfaces inherently display regions characterized by large variations in size, shape, and spatial distribution. These high variability requirements result in low scalability, low production capacity, high complexity, and high maintenance and operational costs for manufacturing systems. Manufacturing molds need to physically emulate normal shapes with large variation while maintaining low complexity. A surface mold actuated with reconfigurable tooling (SMART) is proposed for molds with high variability capacity requirements for Custom Foot Orthoses (CFOs). The proposed Variability Enhanced-KBE (VEN) solution integrates a knowledge base of variations built with statistical shape modeling (SSM), development of a parametric finite element (FE) model, a stepwise design optimization, and Machine Learning (ML) control. The experimentally validated FE model of the SMART system (RMSE < 0.5 mm) is used for design optimization and dataset generation for the ML control algorithm. The fabricated SMART system employs discrete coarse and fine size/shape adjustment in low- and high-variation areas, respectively. Experimental validation of the SMART system confirms an accuracy range of 0.3-0.5 mm (RMSE) across the population, an 84% improvement over the benchmark. This VEN SMART approach may improve manufacturing in various high-variability freeform surface applications.
{"title":"Remaining useful life prediction based on parallel multi-scale feature fusion network","authors":"Yuyan Yin, Jie Tian, Xinfeng Liu","doi":"10.1007/s10845-024-02399-y","DOIUrl":"https://doi.org/10.1007/s10845-024-02399-y","url":null,"abstract":"<p>In the domain of Predictive Health Management (PHM), the prediction of Remaining Useful Life (RUL) is pivotal for averting machinery malfunctions and curtailing maintenance expenditures. Currently, most RUL prediction methods overlook the correlation between local and global information, which may lead to the loss of important features and, consequently, a subsequent decline in predictive precision. To address these limitations, this study presents a groundbreaking deep learning framework termed the Parallel Multi-Scale Feature Fusion Network (PM2FN). This approach leverages the advantages of different network structures by constructing two distinct feature extractors to capture both global and local information, thereby providing a more comprehensive feature set for RUL prediction. Experimental results on two publicly available datasets and a real-world dataset demonstrate the superiority and effectiveness of our method, offering a promising solution for industrial RUL prediction.</p>","PeriodicalId":16193,"journal":{"name":"Journal of Intelligent Manufacturing","volume":"3 1","pages":""},"PeriodicalIF":8.3,"publicationDate":"2024-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140928483","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Continual learning for surface defect segmentation by subnetwork creation and selection","authors":"Aleksandr Dekhovich, Miguel A. Bessa","doi":"10.1007/s10845-024-02393-4","DOIUrl":"https://doi.org/10.1007/s10845-024-02393-4","url":null,"abstract":"<p>We introduce a new continual (or lifelong) learning algorithm called LDA-CP &S that performs segmentation tasks without undergoing catastrophic forgetting. The method is applied to two different surface defect segmentation problems that are learned incrementally, i.e., providing data about one type of defect at a time, while still being capable of predicting every defect that was seen previously. Our method creates a defect-related subnetwork for each defect type via iterative pruning and trains a classifier based on linear discriminant analysis (LDA). At the inference stage, we first predict the defect type with LDA and then predict the surface defects using the selected subnetwork. We compare our method with other continual learning methods showing a significant improvement – mean Intersection over Union better by a factor of two when compared to existing methods on both datasets. Importantly, our approach shows comparable results with joint training when all the training data (all defects) are seen simultaneously.</p>","PeriodicalId":16193,"journal":{"name":"Journal of Intelligent Manufacturing","volume":"20 1","pages":""},"PeriodicalIF":8.3,"publicationDate":"2024-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140886297","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Robust image-based cross-sectional grain boundary detection and characterization using machine learning
Nicholas Satterlee, Runjian Jiang, Eugene Olevsky, Elisa Torresani, Xiaowei Zuo, John S. Kang
Journal of Intelligent Manufacturing, published 2024-05-06. DOI: 10.1007/s10845-024-02383-6

Understanding the anisotropic sintering behavior of 3D-printed materials requires extensive analytical studies of their grain boundary (GB) structures. Accurate characterization of the GBs is critical for studying the metallurgical process. However, it is challenging and time-consuming for sintered 3D-printed materials due to immature etching and residual pores. In this study, we developed a machine learning-based method for characterizing GBs of sintered 3D-printed materials. The developed method is also generalizable and robust enough to characterize GBs in other, non-3D-printed materials. The method can be applied to small datasets because it includes a diffusion network that generates augmented images for training. The study compared various machine learning methods commonly used for segmentation, including UNet, ResNeXt, and an Ensemble of UNets. The comparison showed that the Ensemble of UNets outperformed the other methods for GB detection and characterization. The model is tested on unclear GBs from sintered 3D-printed samples processed with non-optimized etching and classifies the GBs with around 90% accuracy; on images with clear GBs from the literature it classifies GBs with 92% accuracy.

Unknown-class recognition adversarial network for open set domain adaptation fault diagnosis of rotating machinery
Ke Wu, Wei Xu, Qiming Shu, Wenjun Zhang, Xiaolong Cui, Jun Wu
Journal of Intelligent Manufacturing, published 2024-05-04. DOI: 10.1007/s10845-024-02395-2

Transfer learning methods have received abundant attention and been extensively utilized in cross-domain fault diagnosis, where they typically assume that the label sets in the source and target domains coincide. However, the open set domain adaptation problem, in which new fault modes appear in the target domain, is not well solved. To address this problem, an unknown-class recognition adversarial network (UCRAN) is proposed for cross-domain fault diagnosis. Specifically, a three-dimensional discriminator is designed to conduct domain-invariant learning on the source domain, the target known domain, and the target unknown domain. Then, entropy minimization is introduced to determine the decision boundaries. Finally, an a posteriori inference method is developed to calculate the open set recognition weights, which are used to adaptively weigh the importance of known versus unknown classes. The effectiveness and practicability of the proposed UCRAN are validated by a series of experiments. The results show that, compared to other existing methods, UCRAN achieves better diagnostic performance across different domain transfer tasks.

Quantile regression-enriched event modeling framework for dropout analysis in high-temperature superconductor manufacturing
Mai Li, Ying Lin, Qianmei Feng, Wenjiang Fu, Shenglin Peng, Siwei Chen, Mahesh Paidpilli, Chirag Goel, Eduard Galstyan, Venkat Selvamanickam
Journal of Intelligent Manufacturing, published 2024-05-04. DOI: 10.1007/s10845-024-02358-7

High-temperature superconductor (HTS) tapes have shown promising characteristics of high critical current, a prerequisite for applications in high-field magnets. Due to the unstable growth conditions in the HTS manufacturing process, however, frequent dropouts in the critical current impede the consistent performance of HTS tapes. To manufacture HTS tapes at large scale with high yield and uniform performance, it is essential to develop novel data analysis approaches for modeling the dropouts and identifying the related important process parameters. Conventional methods for modeling recurrent events, such as the point process, require the extraction of events from quality measurements. Because the critical current is a continuous process, transforming the time-series measurements into a set of events may not comprehensively represent the drop patterns. To solve this issue, we develop a novel quantile regression-enriched event modeling (QREM) framework that integrates the non-homogeneous Poisson process for modeling the occurrence of dropouts and quantile regression for capturing the drop patterns. By incorporating feature selection and regularization, the proposed framework identifies a set of significant process parameters that can potentially cause the dropouts of HTS tapes. The proposed method is tested on real HTS tapes produced using an advanced manufacturing process, successfully identifying important parameters that influence dropout events, including the substrate temperature and voltage. The results demonstrate that the proposed QREM method outperforms the standard point process in predicting the occurrence of dropouts.