Three-dimensional trajectory extraction and flower structure coupling method for bee-flower interactions

IF 8.9 | CAS Tier 1 (Agricultural and Forestry Sciences) | Q1 AGRICULTURE, MULTIDISCIPLINARY
Ying Leng, Shuaiqi Feng, Ziyi Zhong, Sheng Wu, Minkun Guo, Weiliang Wen, Jian Xu
{"title":"蜂花相互作用的三维轨迹提取及花结构耦合方法","authors":"Ying Leng ,&nbsp;Shuaiqi Feng ,&nbsp;Ziyi Zhong ,&nbsp;Sheng Wu ,&nbsp;Minkun Guo ,&nbsp;Weiliang Wen ,&nbsp;Jian Xu","doi":"10.1016/j.compag.2025.110867","DOIUrl":null,"url":null,"abstract":"<div><div>Understanding the spatial relationship between bee flower-visiting trajectories and floral structures is critical for elucidating pollination guidance mechanisms and improving pollination strategies. However, existing studies generally lack systematic analysis of the dynamic coupling between flight trajectories and floral structures, largely due to limitations in high-precision trajectory acquisition and structural reconstruction. This study proposes a multimodal perception framework that combines micro-computed tomography (μ-CT), ultraviolet (UV) imaging, and RGB-D vision to build a high-resolution 3D bee flight trajectory tracking system. A convolutional block attention module (CBAM) is incorporated into the YOLOv8 network to detect bee head and body positions, achieving a mean trajectory reconstruction error of 0.49  mm. UV images are used to generate pseudo-3D point clouds of nectar guide graphs, which are spatially registered with floral models reconstructed via μ-CT scanning. Multi-source point cloud alignment is performed using principal component analysis (PCA), random sample consensus (RANSAC), and iterative closest point (ICP) algorithms. Trajectory analysis shows that approximately 50.3 % of bee stopping points are concentrated in reproductive organs, highlighting these as core regions of interaction. Nectar guides accounted for 20.3 % of stopping points, suggesting their role in spatial navigation and localization. To the best of our knowledge, this is the first study to achieve high-resolution fusion of bee flight trajectories with detailed floral structures. The proposed approach provides a technical basis for modeling the integrated process of perception, decision-making, behavioral adjustment, and structural interaction, while offering data support for precision pollination strategies and the development of bioinspired systems.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"238 ","pages":"Article 110867"},"PeriodicalIF":8.9000,"publicationDate":"2025-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Three-dimensional trajectory extraction and flower structure coupling method for bee-flower interactions\",\"authors\":\"Ying Leng ,&nbsp;Shuaiqi Feng ,&nbsp;Ziyi Zhong ,&nbsp;Sheng Wu ,&nbsp;Minkun Guo ,&nbsp;Weiliang Wen ,&nbsp;Jian Xu\",\"doi\":\"10.1016/j.compag.2025.110867\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Understanding the spatial relationship between bee flower-visiting trajectories and floral structures is critical for elucidating pollination guidance mechanisms and improving pollination strategies. However, existing studies generally lack systematic analysis of the dynamic coupling between flight trajectories and floral structures, largely due to limitations in high-precision trajectory acquisition and structural reconstruction. This study proposes a multimodal perception framework that combines micro-computed tomography (μ-CT), ultraviolet (UV) imaging, and RGB-D vision to build a high-resolution 3D bee flight trajectory tracking system. 
A convolutional block attention module (CBAM) is incorporated into the YOLOv8 network to detect bee head and body positions, achieving a mean trajectory reconstruction error of 0.49  mm. UV images are used to generate pseudo-3D point clouds of nectar guide graphs, which are spatially registered with floral models reconstructed via μ-CT scanning. Multi-source point cloud alignment is performed using principal component analysis (PCA), random sample consensus (RANSAC), and iterative closest point (ICP) algorithms. Trajectory analysis shows that approximately 50.3 % of bee stopping points are concentrated in reproductive organs, highlighting these as core regions of interaction. Nectar guides accounted for 20.3 % of stopping points, suggesting their role in spatial navigation and localization. To the best of our knowledge, this is the first study to achieve high-resolution fusion of bee flight trajectories with detailed floral structures. The proposed approach provides a technical basis for modeling the integrated process of perception, decision-making, behavioral adjustment, and structural interaction, while offering data support for precision pollination strategies and the development of bioinspired systems.</div></div>\",\"PeriodicalId\":50627,\"journal\":{\"name\":\"Computers and Electronics in Agriculture\",\"volume\":\"238 \",\"pages\":\"Article 110867\"},\"PeriodicalIF\":8.9000,\"publicationDate\":\"2025-08-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers and Electronics in Agriculture\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168169925009731\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169925009731","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Understanding the spatial relationship between bee flower-visiting trajectories and floral structures is critical for elucidating pollination guidance mechanisms and improving pollination strategies. However, existing studies generally lack systematic analysis of the dynamic coupling between flight trajectories and floral structures, largely due to limitations in high-precision trajectory acquisition and structural reconstruction. This study proposes a multimodal perception framework that combines micro-computed tomography (μ-CT), ultraviolet (UV) imaging, and RGB-D vision to build a high-resolution 3D bee flight trajectory tracking system. A convolutional block attention module (CBAM) is incorporated into the YOLOv8 network to detect bee head and body positions, achieving a mean trajectory reconstruction error of 0.49 mm. UV images are used to generate pseudo-3D point clouds of nectar guide graphs, which are spatially registered with floral models reconstructed via μ-CT scanning. Multi-source point cloud alignment is performed using principal component analysis (PCA), random sample consensus (RANSAC), and iterative closest point (ICP) algorithms. Trajectory analysis shows that approximately 50.3% of bee stopping points are concentrated in reproductive organs, highlighting these as core regions of interaction. Nectar guides accounted for 20.3% of stopping points, suggesting their role in spatial navigation and localization. To the best of our knowledge, this is the first study to achieve high-resolution fusion of bee flight trajectories with detailed floral structures. The proposed approach provides a technical basis for modeling the integrated process of perception, decision-making, behavioral adjustment, and structural interaction, while offering data support for precision pollination strategies and the development of bioinspired systems.
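The abstract reports that a convolutional block attention module (CBAM) is inserted into the YOLOv8 detector to localize bee head and body positions, but does not give the module's configuration. The sketch below is a generic PyTorch CBAM block (channel attention followed by spatial attention) of the kind commonly added to detection backbones; the reduction ratio, kernel size, and feature-map shape are illustrative assumptions, not the authors' settings.

```python
# Minimal CBAM sketch (channel attention then spatial attention); illustrative only.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP applied to both the average-pooled and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        return torch.sigmoid(avg + mx)            # (B, C, 1, 1) channel gates

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel-wise mean and max maps, concatenated and mixed by a 7x7 conv.
        avg = torch.mean(x, dim=1, keepdim=True)
        mx, _ = torch.max(x, dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))  # (B, 1, H, W)

class CBAM(nn.Module):
    """Channel attention followed by spatial attention, applied as multiplicative gates."""
    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.ca(x)
        return x * self.sa(x)

if __name__ == "__main__":
    feat = torch.randn(1, 256, 40, 40)   # stand-in for a detector feature map
    print(CBAM(256)(feat).shape)          # torch.Size([1, 256, 40, 40])
```

Applied as `x = cbam(x)` at a chosen stage of the backbone or neck, the block reweights feature channels and spatial locations before the detection heads, which is the usual motivation for adding attention when detecting small, fast-moving targets such as bees.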
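The registration step is described only as PCA, RANSAC, and ICP applied to the multi-source point clouds (UV-derived nectar-guide clouds and μ-CT flower models). The following NumPy/SciPy sketch illustrates the coarse-to-fine idea: PCA aligns centroids and principal axes, then point-to-point ICP refines the rigid transform. The RANSAC stage used by the authors is omitted for brevity, and every parameter and the toy data are illustrative assumptions rather than the paper's settings.

```python
# Coarse PCA alignment followed by point-to-point ICP; illustrative sketch only.
import numpy as np
from scipy.spatial import cKDTree

def pca_coarse_align(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Move src so its centroid and principal axes roughly match dst's."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    # Right-singular vectors of the centered clouds are their principal axes.
    _, _, v_src = np.linalg.svd(src_c, full_matrices=False)
    _, _, v_dst = np.linalg.svd(dst_c, full_matrices=False)
    R = v_dst.T @ v_src
    if np.linalg.det(R) < 0:          # keep a proper rotation (no reflection)
        v_dst[-1] *= -1
        R = v_dst.T @ v_src
    return src_c @ R.T + dst.mean(0)

def icp(src: np.ndarray, dst: np.ndarray, iters: int = 50, tol: float = 1e-6) -> np.ndarray:
    """Point-to-point ICP: nearest-neighbour matching plus a Kabsch rigid update."""
    tree = cKDTree(dst)
    prev_err = np.inf
    for _ in range(iters):
        dists, idx = tree.query(src)
        matched = dst[idx]
        # Kabsch: best-fit rotation between current src and its matched points.
        sc, mc = src.mean(0), matched.mean(0)
        U, _, Vt = np.linalg.svd((src - sc).T @ (matched - mc))
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        src = (src - sc) @ R.T + mc
        err = dists.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return src

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    flower = rng.normal(size=(500, 3))        # stand-in for a μ-CT flower cloud
    guide = flower[:200] + 0.3                # stand-in for a UV-derived partial cloud
    aligned = icp(pca_coarse_align(guide, flower), flower)
```

The coarse PCA stage only needs to bring the two clouds into roughly the same frame; ICP then converges locally, which is why a robust feature-matching step such as RANSAC is typically inserted between the two when the initial pose is uncertain.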
Source journal
Computers and Electronics in Agriculture (Computer Science, Interdisciplinary Applications)
CiteScore: 15.30
Self-citation rate: 14.50%
Articles published: 800
Review time: 62 days
Journal description: Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.