{"title":"ARM cortex M4-based extensible multimodal wearable platform for sensor research and context sensing from motion & sound","authors":"D. Roggen","doi":"10.1145/3410530.3414368","DOIUrl":null,"url":null,"abstract":"We present an extensible sensor research platform suitable for motion- and sound-based activity and context recognition in wearable and ubiquitous computing applications. The 30x30mm platform is extensible through plug-in boards, which makes it well suited to explore novel sensor technologies. Its firmware can acquire 9-axis inertial measurement unit (IMU) data and device orientation in quaternions at up to 565Hz, sound at 16KHz and external analog inputs, without any programming, allowing for use by non-experts. The data of distinct modalities can be acquired in isolation or simultaneously for multimodal sensing, and can be streamed over Bluetooth or stored locally. The platform has a real-time clock, which enables the acquisition of the data from multiple nodes with a ±10ppm frequency tolerance, without requiring inter-node connectivity. This is useful to collect data from multiple people. Acquiring multimodal data, the measured power consumption is 222mW when streaming and 67mW when logging to an SD card. With a 165mAh battery, this leads to 2h15mn and 9h of operation, respectively, with a weight of 10.8g (6.75g without battery).","PeriodicalId":7183,"journal":{"name":"Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers","volume":"12 10 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3410530.3414368","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
We present an extensible sensor research platform suitable for motion- and sound-based activity and context recognition in wearable and ubiquitous computing applications. The 30×30 mm platform is extensible through plug-in boards, which makes it well suited to exploring novel sensor technologies. Without any programming, its firmware can acquire 9-axis inertial measurement unit (IMU) data and device orientation as quaternions at up to 565 Hz, sound at 16 kHz, and external analog inputs, allowing use by non-experts. Data from distinct modalities can be acquired in isolation or simultaneously for multimodal sensing, and can be streamed over Bluetooth or stored locally. The platform has a real-time clock with a ±10 ppm frequency tolerance, which enables data acquisition from multiple nodes without requiring inter-node connectivity; this is useful for collecting data from multiple people. When acquiring multimodal data, the measured power consumption is 222 mW when streaming and 67 mW when logging to an SD card. With a 165 mAh battery, this yields 2 h 15 min and 9 h of operation, respectively, at a weight of 10.8 g (6.75 g without battery).
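The battery-life and synchronization figures in the abstract can be sanity-checked with simple arithmetic. The sketch below is not from the paper: the 3.7 V nominal Li-Po voltage and the 80% usable-energy factor are assumptions introduced here for illustration; only the 165 mAh capacity, the 222 mW / 67 mW power figures, and the ±10 ppm tolerance come from the abstract.

```python
# Back-of-the-envelope checks for the runtime and clock-drift figures.
# ASSUMPTIONS (not from the paper): 3.7 V nominal Li-Po voltage and an
# 80% usable-energy factor to account for converter/cut-off losses.

BATTERY_MAH = 165          # battery capacity quoted in the abstract
NOMINAL_V = 3.7            # assumed Li-Po nominal voltage
P_STREAM_MW = 222          # measured power while streaming (abstract)
P_LOG_MW = 67              # measured power while logging to SD (abstract)

energy_mwh = BATTERY_MAH * NOMINAL_V   # ~610 mWh of nominal stored energy

def runtime_h(power_mw, usable_fraction=1.0):
    """Runtime in hours for a given average power draw."""
    return energy_mwh * usable_fraction / power_mw

print(f"logging:   {runtime_h(P_LOG_MW):.1f} h  (abstract reports 9 h)")
print(f"streaming: {runtime_h(P_STREAM_MW):.1f} h ideal, "
      f"{runtime_h(P_STREAM_MW, 0.8):.1f} h at 80% usable energy "
      f"(abstract reports 2 h 15 min)")

# Worst-case clock offset between two independently recording nodes whose
# RTCs each deviate by up to ±10 ppm: they can drift apart at 20 ppm.
PPM = 10e-6
for hours in (1, 8):
    drift_ms = 2 * PPM * hours * 3600 * 1000
    print(f"max inter-node drift after {hours} h: {drift_ms:.0f} ms")
```

Under these assumptions the 9 h logging figure matches the ideal estimate almost exactly, while the streaming figure sits between the ideal and the 80%-efficiency estimate, and two free-running nodes stay within roughly 72 ms per hour of each other, which is consistent with the paper's claim that multi-person recording needs no inter-node connectivity.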