{"title":"M2S2: A Multimodal Sensor System for Remote Animal Motion Capture in the Wild","authors":"Azraa Vally;Gerald Maswoswere;Nicholas Bowden;Stephen Paine;Paul Amayo;Andrew Markham;Amir Patel","doi":"10.1109/LSENS.2025.3542233","DOIUrl":null,"url":null,"abstract":"Capturing animal locomotion in the wild is far more challenging than in controlled laboratory settings. Wildlife subjects move unpredictably, and issues, such as scaling, occlusion, lighting changes, and the lack of ground truth data, make motion capture difficult. Unlike human biomechanics, where machine learning thrives with annotated datasets, such resources are scarce for wildlife. Multimodal sensing offers a solution by combining the strengths of various sensors, such as Light Detection and Ranging {LiDAR) and thermal cameras, to compensate for individual sensor limitations. In addition, some sensors, like LiDAR, can provide training data for monocular pose estimation models. We introduce a multimodal sensor system (M2S2) for capturing animal motion in the wild. M2S2 integrates RGB, depth, thermal, event, LiDAR, and acoustic sensors to overcome challenges like synchronization and calibration. We showcase its application with data from cheetahs, offering a new resource for advancing sensor fusion algorithms in wildlife motion capture.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 4","pages":"1-4"},"PeriodicalIF":2.2000,"publicationDate":"2025-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Letters","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10887236/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Capturing animal locomotion in the wild is far more challenging than in controlled laboratory settings. Wildlife subjects move unpredictably, and issues such as scaling, occlusion, lighting changes, and the lack of ground-truth data make motion capture difficult. Unlike in human biomechanics, where machine learning thrives on annotated datasets, such resources are scarce for wildlife. Multimodal sensing offers a solution by combining the strengths of various sensors, such as Light Detection and Ranging (LiDAR) and thermal cameras, to compensate for individual sensor limitations. In addition, some sensors, like LiDAR, can provide training data for monocular pose estimation models. We introduce a multimodal sensor system (M2S2) for capturing animal motion in the wild. M2S2 integrates RGB, depth, thermal, event, LiDAR, and acoustic sensors and addresses challenges such as synchronization and calibration. We showcase its application with data from cheetahs, offering a new resource for advancing sensor fusion algorithms in wildlife motion capture.
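The abstract notes that LiDAR can supply training data for monocular pose estimation. A minimal sketch of one common way to do this (not the authors' implementation) is to project LiDAR points into the RGB image using the LiDAR-to-camera extrinsics and the camera intrinsics, producing sparse per-pixel depth targets. All matrices and parameter values below are illustrative placeholders; in practice they would come from the system's calibration step.

```python
# Sketch: project LiDAR points into a camera image to obtain sparse depth
# supervision for a monocular model. Calibration values here are placeholders.
import numpy as np


def project_lidar_to_image(points_lidar, T_cam_from_lidar, K, image_size):
    """Project Nx3 LiDAR points (LiDAR frame) into pixel coordinates.

    Returns (pixels, depths) for points that are in front of the camera
    and fall inside the image bounds.
    """
    n = points_lidar.shape[0]
    # Homogeneous coordinates, then rigid transform into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    in_front = pts_cam[:, 2] > 0.1
    pts_cam = pts_cam[in_front]

    # Pinhole projection with intrinsics K, then normalize by depth.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Keep only projections that land inside the image.
    w, h = image_size
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return uv[inside], pts_cam[inside, 2]


if __name__ == "__main__":
    # Placeholder calibration: identity extrinsics and a generic 640x480 camera.
    T = np.eye(4)
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])
    cloud = np.random.uniform([-2.0, -1.0, 1.0], [2.0, 1.0, 10.0], size=(1000, 3))
    pixels, depths = project_lidar_to_image(cloud, T, K, (640, 480))
    print(pixels.shape, float(depths.min()), float(depths.max()))
```

The resulting pixel-depth pairs could serve as sparse ground truth when training or evaluating a monocular depth or pose model, which is the general role the abstract ascribes to LiDAR in a multimodal rig like M2S2.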