3D neural architecture search to optimize segmentation of plant parts
Farah Saeed, Chenjiao Tan, Tianming Liu, Changying Li
Smart Agricultural Technology, Volume 10, Article 100776 (published 2025-01-11). DOI: 10.1016/j.atech.2025.100776
Abstract
Accurately segmenting plant parts from imagery is vital for improving crop phenotypic traits. However, current 3D deep learning models for segmentation of point cloud data require specific network architectures that are usually manually designed, which is both tedious and suboptimal. To overcome this issue, a 3D neural architecture search (NAS) was performed in this study to optimize cotton plant part segmentation. The search space was designed using Point Voxel Convolution (PVConv) as the basic building block of the network. The NAS framework included a supernetwork with weight sharing and an evolutionary search to find optimal candidates, with three surrogate learners to predict mean IoU, latency, and memory footprint. The optimal candidate found by the proposed method consisted of five PVConv layers with either 32 or 512 output channels, achieving mean IoU and accuracy of over 90 % and 96 %, respectively, and outperforming manually designed architectures. Additionally, the evolutionary search was updated to search for architectures satisfying memory and time constraints, with the searched architectures achieving mean IoU and accuracy of >84 % and 94 %, respectively. Furthermore, a differentiable architecture search (DARTS) utilizing the PVConv operation was implemented for comparison, and our method demonstrated better segmentation performance, with margins of >2 % and 1 % in mean IoU and accuracy, respectively. Overall, the proposed method can be applied to segment cotton plants with an accuracy of over 94 %, while adjusting to available resource constraints.
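To make the search procedure described above more concrete, the following is a minimal sketch (not the authors' code) of an evolutionary architecture search over stacks of PVConv layers, guided by surrogate learners for mean IoU, latency, and memory footprint and constrained by resource budgets. The channel choices, budgets, and surrogate functions are illustrative assumptions only; in the paper the surrogates are trained predictors rather than the closed-form stand-ins used here.

```python
# Sketch of a surrogate-guided evolutionary search over PVConv channel configurations.
# All names, channel options, budgets, and surrogate formulas are hypothetical.
import random

CHANNEL_CHOICES = [32, 64, 128, 256, 512]   # assumed per-layer output-channel options
NUM_LAYERS = 5                              # the reported optimal candidate has 5 PVConv layers
LATENCY_BUDGET_MS = 50.0                    # hypothetical time constraint
MEMORY_BUDGET_MB = 2048.0                   # hypothetical memory constraint


def surrogate_miou(arch):
    """Stand-in for a trained surrogate that predicts mean IoU of a candidate."""
    return 0.80 + 0.0002 * sum(arch) / len(arch) + random.uniform(-0.01, 0.01)


def surrogate_latency_ms(arch):
    """Stand-in for the latency surrogate."""
    return 0.05 * sum(arch)


def surrogate_memory_mb(arch):
    """Stand-in for the memory-footprint surrogate."""
    return 2.0 * sum(arch)


def random_arch():
    """A candidate is a list of output-channel counts, one per PVConv layer."""
    return [random.choice(CHANNEL_CHOICES) for _ in range(NUM_LAYERS)]


def mutate(arch, prob=0.2):
    """Resample each layer's channel count with a small probability."""
    return [random.choice(CHANNEL_CHOICES) if random.random() < prob else c for c in arch]


def feasible(arch):
    """Check the memory and time constraints using the surrogates."""
    return (surrogate_latency_ms(arch) <= LATENCY_BUDGET_MS
            and surrogate_memory_mb(arch) <= MEMORY_BUDGET_MB)


def evolutionary_search(population_size=50, generations=20, parent_frac=0.25):
    population = [random_arch() for _ in range(population_size)]
    for _ in range(generations):
        # Keep candidates that satisfy the constraints, ranked by predicted mean IoU.
        scored = sorted((a for a in population if feasible(a)),
                        key=surrogate_miou, reverse=True)
        parents = scored[:max(1, int(parent_frac * population_size))] or [random_arch()]
        # Next generation: elites plus mutated copies of randomly chosen parents.
        population = parents + [mutate(random.choice(parents))
                                for _ in range(population_size - len(parents))]
    return max(population, key=surrogate_miou)


if __name__ == "__main__":
    best = evolutionary_search()
    print("Best PVConv channel configuration:", best)
    print("Predicted mean IoU:", round(surrogate_miou(best), 3))
```

In the actual NAS framework, each candidate would be evaluated by inheriting weights from the shared supernetwork rather than by a closed-form score, and the surrogate learners would be fitted to measured mean IoU, latency, and memory of sampled subnetworks.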