Smart agricultural technology: Latest Articles

AI-powered cow detection in complex farm environments
IF 6.3
Smart agricultural technology Pub Date: 2025-01-08 DOI: 10.1016/j.atech.2025.100770
Voncarlos M. Araújo, Ines Rili, Thomas Gisiger, Sébastien Gambs, Elsa Vasseur, Marjorie Cellier, Abdoulaye Baniré Diallo
{"title":"AI-powered cow detection in complex farm environments","authors":"Voncarlos M. Araújo ,&nbsp;Ines Rili ,&nbsp;Thomas Gisiger ,&nbsp;Sébastien Gambs ,&nbsp;Elsa Vasseur ,&nbsp;Marjorie Cellier ,&nbsp;Abdoulaye Baniré Diallo","doi":"10.1016/j.atech.2025.100770","DOIUrl":"10.1016/j.atech.2025.100770","url":null,"abstract":"<div><div>Animal welfare has become a critical issue in contemporary society, emphasizing our ethical responsibilities toward animals, particularly within livestock farming. In addition, the advent of Artificial Intelligence (AI) technologies, specifically computer vision, offers a innovative approach to monitoring and enhancing animal welfare. Cows, as essential contributors to sustainable agriculture and climate management, being a central part of it. However, existing cow detection algorithms face significant challenges in real-world farming environments, such as complex lighting, occlusions, pose variations and background interference, which hinder accurate and reliable detection. Additionally, the model generalization power is highly desirable as it enables the model to adapt and perform well across different contexts and conditions, beyond its training environment or dataset. This study addresses these challenges in diverse cow dataset composed of six different environments, including indoor and outdoor scenarios. More precisely, we propose a novel detection model that combines YOLOv8 with the CBAM (Convolutional Block Attention Module) and assess its performance against baseline models, including Mask R-CNN, YOLOv5 and YOLOv8. Our findings indicate that while baseline models show promise, their performance degrades in complex real-world conditions, which our approach improves using the CBAM attention module. Overall, YOLOv8-CBAM outperformed YOLOv8 by 2.3% in mAP across all camera types, achieving a precision of 95.2% and an [email protected]:0.95 of 82.6%, demonstrating superior generalization and enhanced detection accuracy in complex backgrounds. Thus, the primary contributions of this research are: (1) providing an in-depth analysis of current limitations in cow detection under challenging indoor and outdoor environments, (2) proposing a robust general model that effectively detects cows in complex real-world conditions and (3) evaluating and benchmarking state-of-the-art detection algorithms. Potential application scenarios of the model include automated health monitoring, behavioral analysis and tracking within smart farm management systems, enabling precise detection of individual cows, even in challenging environments. By addressing these critical challenges, this study paves the way for future innovations in AI-driven livestock monitoring, aiming to improve the welfare and management of farm animals while advancing smart agriculture.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100770"},"PeriodicalIF":6.3,"publicationDate":"2025-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181509","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
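The abstract above combines YOLOv8 with a Convolutional Block Attention Module (CBAM). As a rough illustration of what such an attention block computes, the following PyTorch sketch implements the standard CBAM recipe (channel attention followed by spatial attention); it is not the authors' code, and where the block would be inserted into the YOLOv8 backbone is an assumption.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average-pooled descriptor
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max-pooled descriptor
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)     # channel-wise max map
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale

class CBAM(nn.Module):
    """Channel attention followed by spatial attention, applied to a feature map."""
    def __init__(self, channels: int):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))

if __name__ == "__main__":
    feats = torch.randn(2, 256, 40, 40)      # dummy backbone feature map
    print(CBAM(256)(feats).shape)            # torch.Size([2, 256, 40, 40])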
Multi-task neural networks for multi-step soil moisture forecasting in vineyards using Internet-of-Things sensors
IF 6.3
Smart agricultural technology Pub Date: 2025-01-08 DOI: 10.1016/j.atech.2025.100769
Ada Baldi, Laura Carnevali, Giovanni Collodi, Marco Lippi, Antonio Manes
{"title":"Multi-task neural networks for multi-step soil moisture forecasting in vineyards using Internet-of-Things sensors","authors":"Ada Baldi ,&nbsp;Laura Carnevali ,&nbsp;Giovanni Collodi ,&nbsp;Marco Lippi ,&nbsp;Antonio Manes","doi":"10.1016/j.atech.2025.100769","DOIUrl":"10.1016/j.atech.2025.100769","url":null,"abstract":"<div><div>Promoting an efficient management of water resources is one of the most crucial challenges in smart farming for the coming years. In this context, developing accurate soil moisture forecasting methods is fundamental in order to optimize irrigation and avoid waste. In this paper, we present a deep learning approach based on the multi-task paradigm, which is exploited to jointly forecast soil moisture at multiple time steps in the future, using a multivariate time-series as input features. Experiments are conducted on a real data set collected via data fusion techniques from Internet-of-Things (IoT) sensors located in a vineyard in Montalcino (Tuscany), showing the advantages of joint multi-step forecasting for prediction horizons that range from 24 to 48 hours ahead.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100769"},"PeriodicalIF":6.3,"publicationDate":"2025-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
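To make the multi-task idea concrete, the sketch below shows one plausible shape for such a model: a shared recurrent encoder over a multivariate sensor window with one regression head per forecast horizon, trained with a summed loss. The layer sizes, horizons and feature count are illustrative assumptions rather than the paper's architecture.

import torch
import torch.nn as nn

class MultiTaskSoilMoistureNet(nn.Module):
    def __init__(self, n_features: int, horizons=(24, 36, 48), hidden: int = 64):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        # one small head per horizon, all trained jointly on a shared representation
        self.heads = nn.ModuleDict({f"h{h}": nn.Linear(hidden, 1) for h in horizons})

    def forward(self, x):                        # x: (batch, time, n_features)
        _, (h_n, _) = self.encoder(x)
        z = h_n[-1]                              # last hidden state as shared features
        return {name: head(z).squeeze(-1) for name, head in self.heads.items()}

model = MultiTaskSoilMoistureNet(n_features=5)
window = torch.randn(8, 96, 5)                   # 8 samples, 96 time steps, 5 sensor channels
preds = model(window)
targets = {k: torch.randn(8) for k in preds}     # placeholder soil moisture targets
loss = sum(nn.functional.mse_loss(preds[k], targets[k]) for k in preds)  # joint loss
loss.backward()
print({k: v.shape for k, v in preds.items()}, loss.item())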
Data generation using Pix2Pix to improve YOLO v8 performance in UAV-based Yuzu detection
IF 6.3
Smart agricultural technology Pub Date: 2025-01-07 DOI: 10.1016/j.atech.2025.100777
Zhen Zhang, Yuu Tanimoto, Makoto Iwata, Shinichi Yoshida
{"title":"Data generation using Pix2Pix to improve YOLO v8 performance in UAV-based Yuzu detection","authors":"Zhen Zhang ,&nbsp;Yuu Tanimoto ,&nbsp;Makoto Iwata ,&nbsp;Shinichi Yoshida","doi":"10.1016/j.atech.2025.100777","DOIUrl":"10.1016/j.atech.2025.100777","url":null,"abstract":"<div><div>Unmanned aerial vehicle (UAV) detection using deep learning techniques plays a crucial role in the pre-harvest estimation of yuzu (Citrus Junos) yield. However, the detection performance of deep learning models heavily depends on the quantity and quality of training data. One of the current challenges is that the work of labeling data is difficult and expensive, because of the high density of fruits, the similarity in color between fruits and leaves, and the varying lighting conditions in the captured images of fruit trees. To address these challenges, we propose to use generative adversarial networks (GANs) for data generation, and then utilize the generated data to improve the yuzu detection performance of YOLO (You Only Look Once) v8 models.</div><div>In this study, the experimental images were photographed using UAVs from two orchards of Kochi agricultural research center between 2020 and 2022. In our approach, we first trained a conditional GAN called Pix2Pix using pairs of images, where the training inputs are the images of fruit trees with all fruits removed, and the training targets are the original images. Subsequently, we created new regions of interest on the images of fruit trees and used the trained Pix2Pix network to generate yuzu fruits within these regions, thereby generating new labeled images. In the experiments, we merged real and generated images to train YOLO v8-series models and explored to reduce the dependency on real training images through the proposed data augmentation approach.</div><div>The results showed that the combined training of these generated and real images can significantly improve the detection performance of YOLO v8-series models, with the maximum improvements of 5.4% in F1-scores, 5.6% in mAP50, and 7.1% in mAP50–90, respectively. Moreover, the proposed data augmentation approach allowed for up to a 50% reduction in the amount of real training images while still achieving improved detection results.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100777"},"PeriodicalIF":6.3,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181508","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
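The augmentation step described above, merging Pix2Pix-generated labeled images with real ones and then training YOLO v8, could look roughly like the sketch below, which uses the ultralytics training API. The directory layout, file names and data YAML are hypothetical placeholders, and the Pix2Pix generation itself is assumed to have already produced image/label pairs.

from pathlib import Path
import shutil

from ultralytics import YOLO  # pip install ultralytics

def merge_datasets(real_dir: str, generated_dir: str, out_dir: str) -> None:
    """Pool real and GAN-generated image/label pairs into a single YOLO training set."""
    out = Path(out_dir)
    for split in ("images/train", "labels/train"):
        (out / split).mkdir(parents=True, exist_ok=True)
    for src in (Path(real_dir), Path(generated_dir)):
        for img in (src / "images/train").glob("*.jpg"):
            shutil.copy(img, out / "images/train" / f"{src.name}_{img.name}")
        for lbl in (src / "labels/train").glob("*.txt"):
            shutil.copy(lbl, out / "labels/train" / f"{src.name}_{lbl.name}")

if __name__ == "__main__":
    merge_datasets("data/yuzu_real", "data/yuzu_pix2pix", "data/yuzu_merged")
    model = YOLO("yolov8s.pt")                                    # pretrained detector
    model.train(data="yuzu_merged.yaml", epochs=100, imgsz=640)   # YAML points at the merged set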
Design and experiment of active obstacle avoidance control system for grapevine interplant weeding based on GNSS
IF 6.3
Smart agricultural technology Pub Date: 2025-01-07 DOI: 10.1016/j.atech.2025.100781
Hao Zhang, Zejiang Meng, Shiwei Wen, Guangyao Liu, Guangrui Hu, Jun Chen, Shuo Zhang
{"title":"Design and experiment of active obstacle avoidance control system for grapevine interplant weeding based on GNSS","authors":"Hao Zhang,&nbsp;Zejiang Meng,&nbsp;Shiwei Wen,&nbsp;Guangyao Liu,&nbsp;Guangrui Hu,&nbsp;Jun Chen,&nbsp;Shuo Zhang","doi":"10.1016/j.atech.2025.100781","DOIUrl":"10.1016/j.atech.2025.100781","url":null,"abstract":"<div><div>Traditional passive obstacle avoidance mechanical weeding strategies heavily relied on touch rods, which led to a high crop damage rate and low weeding efficiency during operations. This study proposed an obstacle avoidance information collection scheme that integrates precise detection of obstacle positions and coordinate conversion of weeding tool positions. An active obstacle avoidance control system based on obstacle positions and real-time tool status was designed. This system consisted of the autonomous navigation equipment, obstacle avoidance information collection units, the control system module, hydraulic execution components, and the real-time monitoring sensor. Based on the requirements for active obstacle avoidance, the study established the relationship between the obstacle avoidance information collection units, hydraulic execution components, and the real-time monitoring sensor, and determined a precise active obstacle avoidance control scheme. Field tests were conducted using machine forward speed as the test factor, with inter-row weeding coverage rate and plant damage rate as evaluation indicators. The test results indicated that when the machine forward speed was 460 mm/s, the combined effect of inter-row weeding coverage and operational efficiency was optimal, with an average inter-row weeding coverage rate of 94.62 % and a plant damage rate of 1.94 %. The active obstacle avoidance weeding scheme proposed in this study provided a technical reference for improving inter-row weeding effectiveness in orchards.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100781"},"PeriodicalIF":6.3,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
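The coordinate conversion of the weeding tool position mentioned in the abstract amounts to a rigid-body transform from the GNSS antenna fix to the tool tip, followed by a proximity check against mapped vine positions. The sketch below illustrates that geometry only; the mounting offsets, clearance radius and compass-heading convention are assumptions, and the paper's hydraulic control logic is not reproduced.

import math

TOOL_OFFSET_FORWARD = -1.20   # metres, tool behind the antenna (assumed mounting)
TOOL_OFFSET_LATERAL = 0.85    # metres, tool to the right of the antenna (assumed)
CLEARANCE_RADIUS = 0.35       # metres of clearance around each vine trunk (assumed)

def tool_position(east: float, north: float, heading_rad: float):
    """Rigid-body transform from antenna fix to tool tip in local east/north coordinates.
    Heading is a compass heading, measured clockwise from north."""
    e = east + TOOL_OFFSET_FORWARD * math.sin(heading_rad) + TOOL_OFFSET_LATERAL * math.cos(heading_rad)
    n = north + TOOL_OFFSET_FORWARD * math.cos(heading_rad) - TOOL_OFFSET_LATERAL * math.sin(heading_rad)
    return e, n

def avoidance_command(tool_e, tool_n, vine_positions):
    """Return 'retract' when any mapped vine lies inside the clearance radius, else 'extend'."""
    for ve, vn in vine_positions:
        if math.hypot(tool_e - ve, tool_n - vn) < CLEARANCE_RADIUS:
            return "retract"
    return "extend"

if __name__ == "__main__":
    e, n = tool_position(east=10.0, north=50.0, heading_rad=math.radians(5))
    print(avoidance_command(e, n, vine_positions=[(10.9, 48.9), (10.9, 52.4)]))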
IoT-enabled solar-powered smart irrigation for precision agriculture
IF 6.3
Smart agricultural technology Pub Date: 2025-01-07 DOI: 10.1016/j.atech.2025.100773
Md. Rasel Al Mamun, Abu Kawsar Ahmed, Sidratul Muntaha Upoma, Md. Mashurul Haque, Muhammad Ashik-E-Rabbani
{"title":"IoT-enabled solar-powered smart irrigation for precision agriculture","authors":"Md. Rasel Al Mamun,&nbsp;Abu Kawsar Ahmed,&nbsp;Sidratul Muntaha Upoma,&nbsp;Md.Mashurul Haque,&nbsp;Muhammad Ashik-E-Rabbani","doi":"10.1016/j.atech.2025.100773","DOIUrl":"10.1016/j.atech.2025.100773","url":null,"abstract":"<div><div>The Internet of Things (IoT) can enable the fourth industrial revolution, significantly boosting production and efficiency in the agricultural sector by optimizing farming practices. This research aims to develop a solar-powered IoT irrigating system. The system comprised a 20W solar panel for powering the base station, a Raspberry Pi 4 for pump control, and a 12V 7.5Ah battery for energy storage. Multiple data-collection substations were established to gather field data. The ESP8266 microcontroller was integrated with a Capacitive Soil Moisture Sensor (V1.2) and a DHT 22 sensor to relay soil moisture, air temperature, and humidity data to the base station via the Message Queuing Telemetry Transport (MQTT) protocol. The battery can power the motor for at least two hours at night, considering a maximum discharge of 75 %, enough to operate the system at the data collection substation. The threshold for pump activation was set at soil moisture below 45 %, with deactivation occurring at or above 80 % to maintain optimal moisture levels. A website was created utilizing the Python Django framework, and an SQLite3 database was implemented, enabling real-time monitoring and remote control of the irrigation pump. Multiple criteria for irrigation were established to enhance the pump's performance so that the developed irrigation system could operate efficiently. This method enables farmers to remotely monitor field conditions and manage irrigation via a website, thereby decreasing reliance on traditional energy sources and reducing water loss during irrigation.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100773"},"PeriodicalIF":6.3,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181132","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
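The pump logic described above is a two-threshold (hysteresis) rule driven by MQTT messages: switch on below 45% soil moisture, switch off at or above 80%. A minimal sketch of that rule follows; the broker address, topic name and JSON payload format are assumptions, and the callbacks use the paho-mqtt 1.x style.

import json
import paho.mqtt.client as mqtt

ON_THRESHOLD = 45.0    # % soil moisture: activate pump below this
OFF_THRESHOLD = 80.0   # % soil moisture: deactivate pump at or above this

class PumpController:
    """Simple two-threshold (hysteresis) controller for the irrigation pump relay."""
    def __init__(self):
        self.pump_on = False

    def update(self, moisture: float) -> bool:
        if moisture < ON_THRESHOLD:
            self.pump_on = True
        elif moisture >= OFF_THRESHOLD:
            self.pump_on = False
        return self.pump_on       # in the real system this state would drive a relay GPIO

controller = PumpController()

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)             # e.g. {"moisture": 42.1, "temp": 28.4}
    state = controller.update(reading["moisture"])
    print(f"moisture={reading['moisture']:.1f}% -> pump {'ON' if state else 'OFF'}")

client = mqtt.Client()                            # paho-mqtt 1.x constructor
client.on_message = on_message
client.connect("base-station.local", 1883)        # hypothetical broker address
client.subscribe("farm/substation1/soil")         # hypothetical topic
client.loop_forever()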
Advancing soybean biomass estimation through multi-source UAV data fusion and machine learning algorithms
IF 6.3
Smart agricultural technology Pub Date: 2025-01-07 DOI: 10.1016/j.atech.2025.100778
Haitao Da, Yaxin Li, Le Xu, Shuai Wang, Limin Hu, Zhengbang Hu, Qiaorong Wei, Rongsheng Zhu, Qingshan Chen, Dawei Xin, Zhenqing Zhao
{"title":"Advancing soybean biomass estimation through multi-source UAV data fusion and machine learning algorithms","authors":"Haitao Da ,&nbsp;Yaxin Li ,&nbsp;Le Xu ,&nbsp;Shuai Wang ,&nbsp;Limin Hu ,&nbsp;Zhengbang Hu ,&nbsp;Qiaorong Wei ,&nbsp;Rongsheng Zhu ,&nbsp;Qingshan Chen ,&nbsp;Dawei Xin ,&nbsp;Zhenqing Zhao","doi":"10.1016/j.atech.2025.100778","DOIUrl":"10.1016/j.atech.2025.100778","url":null,"abstract":"<div><div>Technological advances in unmanned aerial vehicle (UAV) systems offer significant potential for the rapid and efficient monitoring of soybean aboveground biomass (AGB) in precision agriculture, providing an alternative to traditional AGB measurement techniques. However, recent studies have indicated that relying solely on vegetation indices (VIs) can lead to inaccurate AGB estimations due to variability in crop cultivars, growth stages, and environmental conditions. This study evaluated the performance of UAV-derived features (including canopy spectral, textural, and structural features) in estimating AGB across fifty soybean cultivars and multiple growth stages in a two-year field experiment, utilizing various machine learning algorithms (decision tree, DT; random forest, RF; neural network, NN; extreme gradient boosting, XGBoost; and ensemble learning, EL). The findings revealed that: (1) The integration of UAV digital imagery with the canopy height model (CHM) facilitated the estimation of soybean plant height, with the coefficient of determination (R²) and root mean square error (RMSE) values for ground-measured and UAV-derived plant height across different growth stages ranging from 0.72 to 0.88 and 3.35 to 6.13 cm, respectively. (2) Textural and structural features demonstrated good sensitivity to AGB variability across cultivars and growth stages, despite each feature type having its limitations. The fusion of UAV-derived spectral, textural, and structural features yielded the highest accuracy (R² = 0.85), significantly improving model performance compared to using dual (R² ranging from 0.79 to 0.81) feature types. (3) Model accuracy significantly varied across different growth stages. For machine learning algorithms, the EL model outperformed DT, RF, NN, and XGBoost in AGB prediction, consistently providing accurate estimations multiple soybean growth stages. These findings highlight the potential of integrating multi-source UAV features to enhance soybean AGB estimation, facilitating farmers decision-making in precision crop management and assisting breeders to select high- and sustainable-yielding cultivars in large-scale breeding program.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100778"},"PeriodicalIF":6.3,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181512","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
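A hedged scikit-learn sketch of the general workflow, multi-source feature fusion by column-wise concatenation feeding an ensemble (stacked) regressor, is given below. The synthetic features and targets are placeholders, XGBoost is omitted to keep the dependencies to scikit-learn, and the exact ensemble-learning model used in the study may differ.

import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n_plots = 200
spectral = rng.normal(size=(n_plots, 10))     # vegetation indices (placeholder)
textural = rng.normal(size=(n_plots, 8))      # GLCM-style texture features (placeholder)
structural = rng.normal(size=(n_plots, 3))    # e.g. CHM-derived plant height statistics
X = np.hstack([spectral, textural, structural])    # multi-source feature fusion
y = rng.normal(loc=500, scale=100, size=n_plots)   # placeholder AGB values

ensemble = StackingRegressor(
    estimators=[
        ("dt", DecisionTreeRegressor(max_depth=6)),
        ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
        ("nn", MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)),
    ],
    final_estimator=Ridge(),
)
print("CV R^2:", cross_val_score(ensemble, X, y, cv=5, scoring="r2").mean())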
Using internet technology for business entrepreneurial choice: Evidence from Chinese farming households
IF 6.3
Smart agricultural technology Pub Date: 2025-01-07 DOI: 10.1016/j.atech.2025.100775
Baoling Zou, Feiyun Yang, Ashok K. Mishra
{"title":"Using internet technology for business entrepreneurial choice: Evidence from Chinese farming households","authors":"Baoling Zou ,&nbsp;Feiyun Yang ,&nbsp;Ashok K. Mishra","doi":"10.1016/j.atech.2025.100775","DOIUrl":"10.1016/j.atech.2025.100775","url":null,"abstract":"<div><div>Entrepreneurship has the potential to stimulate employment, diversify and increase income, and foster shared prosperity. Rural farm families have harnessed the advantages of digital technologies like the internet by using them for input acquisition, new technologies, production methods, advertising and marketing, income diversification (farm and off-farm work), and entrepreneurial choices. Farm families work off-farm by engaging in self-employment and creating their off-farm businesses (entrepreneurship). This study analyzes farmers’ Internet usage in terms of entrepreneurial decisions using the 2018 China Family Panel Studies survey. The results show that Internet usage significantly promotes farmers’ entrepreneurial choice, and the estimated effects are robust. The mechanism analysis reveals that Internet usage mainly affects farmers’ entrepreneurial choices through financial capital, social networks, and risk preference attitudes. Finally, heterogeneity analysis indicates that Internet usage encourages entrepreneurial choices among farmers with higher education or family income.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100775"},"PeriodicalIF":6.3,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143183259","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Vision transformers for automated detection of pig interactions in groups
IF 6.3
Smart agricultural technology Pub Date: 2025-01-07 DOI: 10.1016/j.atech.2025.100774
Gbadegesin Taiwo, Sunil Vadera, Ali Alameer
{"title":"Vision transformers for automated detection of pig interactions in groups","authors":"Gbadegesin Taiwo,&nbsp;Sunil Vadera,&nbsp;Ali Alameer","doi":"10.1016/j.atech.2025.100774","DOIUrl":"10.1016/j.atech.2025.100774","url":null,"abstract":"<div><div>The interactive behaviour of pigs is an important determinant of their social development and overall well-being. Manual observation and identification of contact behaviour can be time-consuming and potentially subjective. This study presents a new method for the dynamic detection of pig head to rear interaction using the Vision Transformer (ViT). The ViT model achieved a high accuracy in detecting and classifying specific interaction behaviour as trained on the pig contact datasets, capturing interaction behaviour. The model's ability to recognize contextual spatial data enables strong detection even in complex contexts, due to the use of Gaussian Error Linear Unit (GELU) an activation function responsible for introduction of non-linear data to the model and Multi Head Attention feature that ensures all relevant details contained in a data are captured in Vision Transformer. The method provides an efficient method for monitoring swine behaviour for instance, contact between pigs, facilitating better livestock management and livestock welfare. The ViT can represent a significant improvement on current automated behaviour detection, opening new possibilities for accurate animal design and animal behaviour assessment with an accuracy and F1 score of 82.8 % and 82.7 %, respectively, while we have an AUC of 85 %.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100774"},"PeriodicalIF":6.3,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143183258","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
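As an illustration of the ViT-as-classifier idea (GELU activations and multi-head attention are built into the architecture), the sketch below fine-tunes torchvision's ViT-B/16 for a binary head-to-rear contact decision. The backbone choice, input size and training step are assumptions, not the study's exact setup.

import torch
import torch.nn as nn
from torchvision.models import vit_b_16, ViT_B_16_Weights

model = vit_b_16(weights=ViT_B_16_Weights.DEFAULT)               # ImageNet-pretrained backbone
model.heads.head = nn.Linear(model.heads.head.in_features, 2)    # contact / no contact

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# one illustrative training step on dummy frames (real input: 224x224 pen-camera crops)
frames = torch.randn(4, 3, 224, 224)
labels = torch.tensor([0, 1, 1, 0])
logits = model(frames)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
print(loss.item(), logits.argmax(dim=1))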
An autonomous obstacle avoidance and path planning method for fruit-picking UAV in orchard environments
IF 6.3
Smart agricultural technology Pub Date: 2025-01-06 DOI: 10.1016/j.atech.2024.100752
Jun Li, Haobo Zhou, Yuju Mai, Yuhang Jia, Zhengqi Zhou, Kaixuan Wu, Hengxu Chen, Hengyi Lin, Mingda Luo, Linlin Shi
{"title":"An autonomous obstacle avoidance and path planning method for fruit-picking UAV in orchard environments","authors":"Jun Li ,&nbsp;Haobo Zhou ,&nbsp;Yuju Mai ,&nbsp;Yuhang Jia ,&nbsp;Zhengqi Zhou ,&nbsp;Kaixuan Wu ,&nbsp;Hengxu Chen ,&nbsp;Hengyi Lin ,&nbsp;Mingda Luo ,&nbsp;Linlin Shi","doi":"10.1016/j.atech.2024.100752","DOIUrl":"10.1016/j.atech.2024.100752","url":null,"abstract":"<div><div>In orchard environments, compared with picking robotic arms, improving the efficiency and safety of the fruit-picking unmanned aerial vehicle (UAV) becomes more challenging. In this paper, an autonomous obstacle avoidance and path planning method based on LiDAR data is proposed for the self-built fruit-picking UAV. First, a LiDAR static-dynamic dual map construction scheme is designed. Using the original point cloud data from LiDAR, a time-accumulated local point cloud map is generated to provide orchard obstacle information for path planning. Then, an improved hybrid A* algorithm based on the B-spline curve is proposed. This algorithm not only comprehensively takes into account the impact of surrounding branches on the flight of the picking UAV near the target fruit bunch, but also ensures that the planned path meets the specific action requirements of the picking UAV when picking the target fruit bunch. The experimental results demonstrate that the proposed map construction scheme significantly reduces the computational power requirements and collision detection time. Moreover, the path planning algorithm effectively guides the UAV and its attached picking actuator to successfully navigate around obstacles, enabling efficient picking of the target fruit bunch. Indicating that the proposed method provides a feasible solution for task execution of the fruit-picking UAV in complex orchard environments.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100752"},"PeriodicalIF":6.3,"publicationDate":"2025-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143181129","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
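The B-spline element of the improved hybrid A* can be pictured as a smoothing pass over the discrete waypoints returned by the search. The SciPy sketch below fits a cubic B-spline through made-up waypoints; the hybrid A* search itself and the UAV's kinematic constraints are not reproduced here.

import numpy as np
from scipy.interpolate import splev, splprep

# hypothetical hybrid-A* waypoints around a branch obstacle (x, y, z in metres)
waypoints = np.array([
    [0.0, 0.0, 1.5],
    [0.8, 0.3, 1.6],
    [1.5, 1.0, 1.8],
    [2.2, 1.2, 2.0],
    [3.0, 1.0, 2.1],
    [3.8, 0.4, 2.0],
])

# fit a cubic B-spline through the waypoints (s trades smoothing against interpolation)
tck, _ = splprep(waypoints.T, s=0.05, k=3)
u_fine = np.linspace(0.0, 1.0, 100)
x, y, z = splev(u_fine, tck)
smooth_path = np.column_stack([x, y, z])   # dense, smooth path for the flight controller to track
print(smooth_path.shape)                   # (100, 3)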
A low-cost autonomous portable poultry egg freshness machine using majority voting-based ensemble machine learning classifiers
IF 6.3
Smart agricultural technology Pub Date: 2025-01-06 DOI: 10.1016/j.atech.2025.100768
Jirayut Hansot, Wongsakorn Wongsaroj, Thaksin Sangsuwan, Natee Thong-un
{"title":"A low-cost autonomous portable poultry egg freshness machine using majority voting-based ensemble machine learning classifiers","authors":"Jirayut Hansot ,&nbsp;Wongsakorn Wongsaroj ,&nbsp;Thaksin Sangsuwan ,&nbsp;Natee Thong-un","doi":"10.1016/j.atech.2025.100768","DOIUrl":"10.1016/j.atech.2025.100768","url":null,"abstract":"<div><div>One of the most precise and quick ways for classifying and judging the freshness of agricultural items based on density assessment is water displacement. The use of this approach in agricultural inspections of items like eggs that absorb water, which might be invasive and affect the results of measurements, is currently not recommended. Here, we present a novel automatic machine for low cost, simple and real—time monitoring of the sizing and freshness assessment of eggs based on height and width measurement of yolk using machine learning and a weight sensor. This is the first proposal that divides egg freshness into intervals through height and width measurements. For the purpose of determining the egg's weight, the weighing system was created using a loadcell as the weight sensor. The height and width of the yolk were pictured by two cameras to classify egg freshness. The proposed machine learning model is an ensemble machine learning algorithm, which integrates predictions obtained from several individual classifiers like Random Forest, Decision Trees, Support Vector Machine, Naïve Bayes, <em>k</em>-Nearest Neighbors and Logical Regression to make a final prediction. The proposed Hard voting model improved accuracy and robustness of final prediction compared to a Soft voting classifier. The proposed model obtained an accuracy of 100 % when compared with the typical Yolk index method. This study presents that egg freshness can be determined through yolk dimension without using water to test for water displacement which has future potential as a measuring machine for the poultry industry.</div></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":"10 ","pages":"Article 100768"},"PeriodicalIF":6.3,"publicationDate":"2025-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143183257","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
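The majority-voting ensemble the abstract lists maps directly onto scikit-learn's VotingClassifier with voting="hard". The sketch below wires up the six named base classifiers on the three measured features (yolk height, yolk width, egg weight); the data and hyperparameters are synthetic placeholders. With real features, predict() returns the class that wins the majority vote across the six models.

import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))            # [yolk height, yolk width, egg weight] (placeholder)
y = rng.integers(0, 3, size=300)         # freshness-interval label (placeholder)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=1)),
        ("dt", DecisionTreeClassifier(max_depth=5)),
        ("svm", make_pipeline(StandardScaler(), SVC())),
        ("nb", GaussianNB()),
        ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
        ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ],
    voting="hard",                        # majority vote over the six base predictions
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))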