A multi-vision monitoring framework for simultaneous real-time unmanned aerial monitoring of farmer activity and crop health

Anton Louise P. De Ocampo, Francis Jesmar P. Montalbo

Smart Agricultural Technology (Q1, Agricultural Engineering), published 2024-05-16. DOI: 10.1016/j.atech.2024.100466. Open-access PDF: https://www.sciencedirect.com/science/article/pii/S2772375524000716
Current remote sensing technologies employing Unmanned Aerial Vehicles (UAVs) for farm monitoring have shown promise in characterizing the environment through diverse sensor systems, including hyperspectral cameras, LiDAR, thermal cameras, and RGB sensors. However, these solutions typically specialize in either activity recognition or crop monitoring, but not both. To address this limitation and enhance efficacy, we propose a multi-vision monitoring (MVM) framework capable of simultaneously recognizing farm activities and assessing crop health. Our approach applies computer vision techniques that transform aerial videos into sequential images to extract essential environmental features. Central to our framework are two pivotal components: the Farmer Activity Recognition (FAR) algorithm and the Crop Image Analysis (CIA) component. The FAR algorithm introduces a novel feature extraction method that captures motion across various maps, enabling distinct feature sets for each activity. Meanwhile, the CIA component utilizes the normalized Triangular Greenness Index (nTGI) to estimate leaf chlorophyll levels, an important indicator of crop health. By unifying these components, we achieve dual functionality—activity recognition and crop health estimation—from identical input data, thereby enhancing efficiency and versatility in farm monitoring. Our framework employs a diverse range of machine learning models, demonstrating that the extracted features can address both tasks effectively in unison.
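As context for the CIA component described in the abstract: the Triangular Greenness Index is a standard RGB vegetation index, commonly given as TGI ≈ G − 0.39·R − 0.61·B (after Hunt et al.). The abstract does not specify how the authors normalize it, so the per-image min–max normalization below is an assumption for illustration, not the paper's method:

```python
import numpy as np

def tgi(rgb: np.ndarray) -> np.ndarray:
    """Triangular Greenness Index per pixel for an (H, W, 3) RGB image.

    Uses the common approximation TGI = G - 0.39*R - 0.61*B;
    higher values indicate greener (higher-chlorophyll) vegetation.
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    return g - 0.39 * r - 0.61 * b

def normalized_tgi(rgb: np.ndarray) -> np.ndarray:
    """Min-max normalize TGI to [0, 1] per image (assumed normalization)."""
    t = tgi(rgb)
    t_min, t_max = t.min(), t.max()
    if t_max == t_min:  # flat image: avoid division by zero
        return np.zeros_like(t)
    return (t - t_min) / (t_max - t_min)
```

A greener pixel yields a higher nTGI value, which the paper uses as a proxy for leaf chlorophyll; the exact index calibration and thresholds would need to come from the full text.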