Video recognition using ambient light sensors
Lorenz Schwittmann, V. Matkovic, Matthäus Wander, Torben Weis
2016 IEEE International Conference on Pervasive Computing and Communications (PerCom), 2016-03-14
DOI: 10.1109/PERCOM.2016.7456511
Citations: 11
Abstract
We present a method for recognizing a video that is playing on a TV screen by sampling the ambient light sensor of a user's smartphone. This improves situation awareness in pervasive systems because the phone can determine what the user is currently watching on TV. Our method works even if the phone has no direct line of sight to the TV screen, since ambient light reflected from walls is sufficient. Our evaluation shows that a 100% recognition ratio of the current TV channel is possible by sampling a sequence of 15 to 120 seconds in length, depending on how favorable the measuring conditions are. In addition, we evaluated the recognition ratio when the user is watching video-on-demand, where the set of possible videos is large. Professional YouTube videos were recognized with a 92% recognition ratio; amateur videos were recognized correctly at a rate of 60% because these videos have fewer cuts. Our method focuses on detecting the time difference between video cuts, because the light emitted by the screen changes instantly with most cuts and this change is easily measurable with the ambient light sensor. Using the ambient light sensor instead of the camera greatly reduces energy consumption and bandwidth usage and raises fewer privacy concerns. Hence, it is feasible to run the measurement in the background for a longer time without draining the battery and without sending camera shots to a remote server for analysis.
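The core idea described in the abstract can be sketched in a few lines: detect cuts as abrupt jumps in the ambient-light sample stream, and compare the resulting inter-cut time differences against reference fingerprints of candidate videos or channels. The following Python sketch is an illustrative outline under assumed parameters (sampling rate, brightness-jump threshold, matching tolerance), not the authors' implementation.

```python
# Minimal sketch (assumptions noted below), not the paper's implementation:
# detect cuts as abrupt brightness jumps in an ambient-light sample stream,
# then compare the inter-cut time differences against reference fingerprints.
from typing import Dict, List, Sequence

SAMPLE_RATE_HZ = 10.0      # assumed ambient-light sampling rate
CUT_THRESHOLD_LUX = 5.0    # assumed minimum brightness jump that counts as a cut
MATCH_TOLERANCE_S = 0.2    # assumed tolerance when comparing cut intervals


def detect_cut_times(lux_samples: Sequence[float]) -> List[float]:
    """Return timestamps (seconds) where consecutive samples jump sharply."""
    cuts = []
    for i in range(1, len(lux_samples)):
        if abs(lux_samples[i] - lux_samples[i - 1]) >= CUT_THRESHOLD_LUX:
            cuts.append(i / SAMPLE_RATE_HZ)
    return cuts


def cut_intervals(cut_times: Sequence[float]) -> List[float]:
    """Time differences between successive cuts: the fingerprint used for matching."""
    return [b - a for a, b in zip(cut_times, cut_times[1:])]


def match_score(measured: Sequence[float], reference: Sequence[float]) -> int:
    """Count measured intervals that appear (within tolerance) in the reference fingerprint."""
    score = 0
    remaining = list(reference)
    for interval in measured:
        for j, ref_interval in enumerate(remaining):
            if abs(interval - ref_interval) <= MATCH_TOLERANCE_S:
                score += 1
                del remaining[j]  # each reference interval may match only once
                break
    return score


def recognize(lux_samples: Sequence[float], fingerprints: Dict[str, List[float]]) -> str:
    """Pick the video/channel whose cut-interval fingerprint best matches the measurement."""
    measured = cut_intervals(detect_cut_times(lux_samples))
    return max(fingerprints, key=lambda name: match_score(measured, fingerprints[name]))
```

In this sketch, longer sampling windows yield more cut intervals and therefore more discriminative fingerprints, which is consistent with the abstract's observation that 15 to 120 seconds of sampling suffice depending on measuring conditions and that videos with fewer cuts (e.g. amateur footage) are harder to recognize.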