{"title":"基于传感器的深度学习和基于视觉的图像处理的工具状态监测双方法","authors":"Ahmed Abdeltawab , Zhang Xi , Zhang Longjia","doi":"10.1016/j.jmapro.2025.06.041","DOIUrl":null,"url":null,"abstract":"<div><div>Accurate decision-making in cutting tool condition monitoring (TCM) is critical for maintaining efficiency and quality in modern manufacturing systems. Traditional detection methods, whether direct or indirect, face several limitations and challenges. Different cutting parameters and signal noise often affect indirect methods, which can compromise reliability. Direct approaches for TCM frequently require physically measuring tool wear by visually monitoring it after cutting a sufficient distance on the workpiece. This requires the process to be interrupted for tool condition evaluation, therefore negatively impacting production efficiency. This study proposes a novel dual-check approach that combines sensor-based and vision-based techniques to enhance tool condition monitoring. First, the indirect method utilizes scalogram images derived from acoustic emission signals, analyzed using transfer learning models, including pre-trained networks such as GoogLeNet, SqueezeNet, VGG19, ShuffleNet, and ResNet50. Despite achieving up to 70 % accuracy under certain conditions, sensor signal noise reduced identification accuracy to below 50 %. A direct vision-based method is introduced to address these limitations, using projected rotating tool images to capture individual cutting teeth for a more accurate assessment of tool conditions. This integrated approach improves tool condition identification by combining the strengths of both methods, enhancing overall accuracy and reliability. The study demonstrates the potential of Industry 4.0 technologies, such as advanced imaging and CNC control systems, to revolutionize manufacturing processes by increasing efficiency and ensuring high-quality automated production.</div></div>","PeriodicalId":16148,"journal":{"name":"Journal of Manufacturing Processes","volume":"150 ","pages":"Pages 224-256"},"PeriodicalIF":6.1000,"publicationDate":"2025-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Dual-method approach of tool condition monitoring using sensor-based deep learning with vision-based image processing\",\"authors\":\"Ahmed Abdeltawab , Zhang Xi , Zhang Longjia\",\"doi\":\"10.1016/j.jmapro.2025.06.041\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Accurate decision-making in cutting tool condition monitoring (TCM) is critical for maintaining efficiency and quality in modern manufacturing systems. Traditional detection methods, whether direct or indirect, face several limitations and challenges. Different cutting parameters and signal noise often affect indirect methods, which can compromise reliability. Direct approaches for TCM frequently require physically measuring tool wear by visually monitoring it after cutting a sufficient distance on the workpiece. This requires the process to be interrupted for tool condition evaluation, therefore negatively impacting production efficiency. This study proposes a novel dual-check approach that combines sensor-based and vision-based techniques to enhance tool condition monitoring. First, the indirect method utilizes scalogram images derived from acoustic emission signals, analyzed using transfer learning models, including pre-trained networks such as GoogLeNet, SqueezeNet, VGG19, ShuffleNet, and ResNet50. 
Despite achieving up to 70 % accuracy under certain conditions, sensor signal noise reduced identification accuracy to below 50 %. A direct vision-based method is introduced to address these limitations, using projected rotating tool images to capture individual cutting teeth for a more accurate assessment of tool conditions. This integrated approach improves tool condition identification by combining the strengths of both methods, enhancing overall accuracy and reliability. The study demonstrates the potential of Industry 4.0 technologies, such as advanced imaging and CNC control systems, to revolutionize manufacturing processes by increasing efficiency and ensuring high-quality automated production.</div></div>\",\"PeriodicalId\":16148,\"journal\":{\"name\":\"Journal of Manufacturing Processes\",\"volume\":\"150 \",\"pages\":\"Pages 224-256\"},\"PeriodicalIF\":6.1000,\"publicationDate\":\"2025-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Manufacturing Processes\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1526612525006991\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MANUFACTURING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Manufacturing Processes","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1526612525006991","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MANUFACTURING","Score":null,"Total":0}
Dual-method approach of tool condition monitoring using sensor-based deep learning with vision-based image processing
Accurate decision-making in cutting tool condition monitoring (TCM) is critical for maintaining efficiency and quality in modern manufacturing systems. Traditional detection methods, whether direct or indirect, face several limitations and challenges. Indirect methods are often affected by varying cutting parameters and signal noise, which can compromise their reliability. Direct approaches typically require physically measuring tool wear by visually inspecting the tool after it has cut a sufficient distance on the workpiece, which means the process must be interrupted for tool condition evaluation, negatively impacting production efficiency. This study proposes a novel dual-check approach that combines sensor-based and vision-based techniques to enhance tool condition monitoring. First, the indirect method uses scalogram images derived from acoustic emission signals, analyzed with transfer learning models built on pre-trained networks such as GoogLeNet, SqueezeNet, VGG19, ShuffleNet, and ResNet50. Although this method achieved up to 70 % accuracy under certain conditions, sensor signal noise reduced identification accuracy to below 50 %. To address these limitations, a direct vision-based method is introduced that uses projected images of the rotating tool to capture individual cutting teeth for a more accurate assessment of tool condition. The integrated approach combines the strengths of both methods, improving the overall accuracy and reliability of tool condition identification. The study demonstrates the potential of Industry 4.0 technologies, such as advanced imaging and CNC control systems, to revolutionize manufacturing processes by increasing efficiency and ensuring high-quality automated production.
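The indirect pipeline summarized in the abstract (acoustic-emission signal, converted to a scalogram, classified with a fine-tuned pre-trained CNN) can be sketched in a few lines of Python. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the Morlet wavelet, the scale range, the three-class label set, and the choice of PyWavelets with a torchvision ResNet50 backbone are all assumptions made for the example.

# Minimal sketch of the sensor-based (indirect) TCM pipeline: AE signal -> CWT
# scalogram image -> transfer-learning classifier. All parameter values and the
# class set are illustrative assumptions, not values from the paper.
import numpy as np
import pywt
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

def signal_to_scalogram(signal: np.ndarray, fs: float, n_scales: int = 64) -> Image.Image:
    """Compute a continuous-wavelet-transform scalogram of an AE segment and return it as an RGB image."""
    scales = np.arange(1, n_scales + 1)
    coeffs, _ = pywt.cwt(signal, scales, wavelet="morl", sampling_period=1.0 / fs)
    power = np.abs(coeffs)
    power = (power - power.min()) / (power.max() - power.min() + 1e-12)  # normalize to [0, 1]
    img = (255 * power).astype(np.uint8)
    return Image.fromarray(img).convert("RGB").resize((224, 224))

# Transfer learning: reuse ImageNet features and retrain only the classifier head.
NUM_CLASSES = 3  # e.g. fresh / worn / broken -- assumed tool-condition classes
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
for p in model.parameters():
    p.requires_grad = False                              # freeze the pre-trained backbone
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new trainable head
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Example inference on one synthetic AE segment (a stand-in for real sensor data).
ae_segment = np.random.randn(4096)
x = preprocess(signal_to_scalogram(ae_segment, fs=1_000_000)).unsqueeze(0)
with torch.no_grad():
    logits = model(x)
print("predicted tool-condition class:", logits.argmax(dim=1).item())

In practice the new head (and optionally the last backbone stage) would be trained on labeled scalograms before inference; the untrained head here only shows how the pieces fit together.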
About the journal:
The aim of the Journal of Manufacturing Processes (JMP) is to exchange current and future directions of manufacturing processes research, development, and implementation, and to publish archival scholarly literature with a view to advancing state-of-the-art manufacturing processes and encouraging innovation in developing new and efficient processes. The journal also publishes contributions from other research communities to enable rapid communication of innovative new concepts. Special-topic issues on emerging technologies and invited papers will also be published.