Three-dimensional trajectory extraction and flower structure coupling method for bee-flower interactions

Authors: Ying Leng, Shuaiqi Feng, Ziyi Zhong, Sheng Wu, Minkun Guo, Weiliang Wen, Jian Xu
DOI: 10.1016/j.compag.2025.110867
Journal: Computers and Electronics in Agriculture, Volume 238, Article 110867 (Q1, Agriculture, Multidisciplinary)
Published: 2025-08-11
URL: https://www.sciencedirect.com/science/article/pii/S0168169925009731
Understanding the spatial relationship between bee flower-visiting trajectories and floral structures is critical for elucidating pollination guidance mechanisms and improving pollination strategies. However, existing studies generally lack systematic analysis of the dynamic coupling between flight trajectories and floral structures, largely due to limitations in high-precision trajectory acquisition and structural reconstruction. This study proposes a multimodal perception framework that combines micro-computed tomography (μ-CT), ultraviolet (UV) imaging, and RGB-D vision to build a high-resolution 3D bee flight trajectory tracking system. A convolutional block attention module (CBAM) is incorporated into the YOLOv8 network to detect bee head and body positions, achieving a mean trajectory reconstruction error of 0.49 mm. UV images are used to generate pseudo-3D point clouds of nectar guide graphs, which are spatially registered with floral models reconstructed via μ-CT scanning. Multi-source point cloud alignment is performed using principal component analysis (PCA), random sample consensus (RANSAC), and iterative closest point (ICP) algorithms. Trajectory analysis shows that approximately 50.3 % of bee stopping points are concentrated in reproductive organs, highlighting these as core regions of interaction. Nectar guides accounted for 20.3 % of stopping points, suggesting their role in spatial navigation and localization. To the best of our knowledge, this is the first study to achieve high-resolution fusion of bee flight trajectories with detailed floral structures. The proposed approach provides a technical basis for modeling the integrated process of perception, decision-making, behavioral adjustment, and structural interaction, while offering data support for precision pollination strategies and the development of bioinspired systems.
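The multi-source registration step described in the abstract (PCA for coarse alignment, refined with ICP) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the RANSAC feature-matching stage is omitted, the clouds and transform are made up, and a brute-force nearest-neighbour search stands in for the spatial index a real pipeline would use.

```python
# Minimal coarse-to-fine rigid alignment sketch (Kabsch + ICP).
# Illustrative only; the paper's RANSAC stage and real point clouds
# (UV pseudo-3D nectar guides, mu-CT floral models) are not reproduced here.
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch/SVD: rotation R and translation t minimizing ||R @ src_i + t - dst_i||."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=50):
    """Iterative closest point with brute-force nearest-neighbour matching."""
    cur = src.copy()
    for _ in range(iters):
        # pair each source point with its nearest target point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    R, t = best_rigid_transform(src, cur)          # net transform src -> cur
    return R, t, cur

# Synthetic demo: recover a known small rotation + translation.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 3))                  # stand-in "floral model" points
angle = np.deg2rad(10.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
moved = cloud @ R_true.T + np.array([0.05, -0.02, 0.03])
R, t, aligned = icp(moved, cloud)
err = np.abs(aligned - cloud).max()                # residual after alignment
```

In a full pipeline the PCA axes of each cloud would supply the coarse initial pose and a k-d tree would replace the O(N·M) distance matrix, but the closed-form Kabsch solve inside the ICP loop is the same.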
About the journal:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.