Aerial Drones with Direction Sensitive DeepEars
Pragyan Mohapatra, A. Ramaswamy, J. Gubbi, Achanna Anil Kumar, P. Misra
Proceedings of the 4th ACM Workshop on Micro Aerial Vehicle Networks, Systems, and Applications, 10 June 2018. DOI: 10.1145/3213526.3213530
Abstract
Recent years have seen a huge increase in the use of small unmanned aircraft, otherwise known as micro aerial vehicles (MAVs), in a variety of monitoring applications. In emergency-response applications, acoustic sensing plays a key role in locating sound-emitting targets (e.g., a person in distress), especially in visually occluded environments. Our endeavour, therefore, is to equip MAVs with acoustic "ears"; as an initial prerequisite, our aim is to develop a robust acoustic direction-finding system. To achieve autonomous MAV navigation using acoustic signals, a direction-of-arrival (DoA) estimation mechanism that is robust in low signal-to-noise ratio (SNR) conditions becomes a necessity. In this paper, we propose Drone-DeepEars: a new DoA estimation framework based on deep learning and optimized for sensor arrays with few sensing elements. We show that its DoA estimation accuracy is better than that of state-of-the-art techniques (such as MUSIC and ESPRIT) at high noise levels, while incurring a lower computational footprint.
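The abstract compares Drone-DeepEars against classical subspace baselines such as MUSIC and ESPRIT. For readers unfamiliar with that baseline, the sketch below is a minimal NumPy implementation of MUSIC for a uniform linear array; the element count, array spacing, source model, and SNR chosen here are illustrative assumptions and are not taken from the paper.

```python
# Minimal MUSIC direction-of-arrival (DoA) sketch for a uniform linear array (ULA).
# Assumptions (not from the paper): a 4-element half-wavelength ULA, a single
# narrowband source, and additive white Gaussian noise.
import numpy as np

def steering_vector(theta_deg, n_mics, spacing=0.5):
    """Array response of a ULA with element spacing given in wavelengths."""
    theta = np.deg2rad(theta_deg)
    n = np.arange(n_mics)
    return np.exp(-2j * np.pi * spacing * n * np.sin(theta))

def music_spectrum(snapshots, n_sources, scan_deg):
    """Return the MUSIC pseudo-spectrum over the scan angles.

    snapshots: (n_mics, n_snapshots) complex array of sensor data.
    """
    n_mics = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # eigenvalues ascending
    En = eigvecs[:, : n_mics - n_sources]                     # noise subspace
    spectrum = []
    for theta in scan_deg:
        a = steering_vector(theta, n_mics)
        denom = np.abs(a.conj() @ En @ En.conj().T @ a)       # orthogonality measure
        spectrum.append(1.0 / denom)
    return np.asarray(spectrum)

# Simulate one source at 25 degrees under 0 dB SNR and estimate its DoA.
rng = np.random.default_rng(0)
n_mics, n_snaps, true_doa, snr_db = 4, 200, 25.0, 0.0
a = steering_vector(true_doa, n_mics)[:, None]
s = (rng.standard_normal(n_snaps) + 1j * rng.standard_normal(n_snaps)) / np.sqrt(2)
noise_std = 10 ** (-snr_db / 20)
noise = noise_std * (rng.standard_normal((n_mics, n_snaps))
                     + 1j * rng.standard_normal((n_mics, n_snaps))) / np.sqrt(2)
x = a * s + noise

scan = np.arange(-90, 90.5, 0.5)
p = music_spectrum(x, n_sources=1, scan_deg=scan)
print("Estimated DoA:", scan[np.argmax(p)], "degrees")
```

As the SNR drops or the number of array elements shrinks, the noise-subspace estimate in such subspace methods degrades, which is precisely the regime the abstract claims the learned Drone-DeepEars estimator targets.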