{"title":"Music Recommendation System through Hand Gestures and Facial Emotions","authors":"Meeta Chaudhry, Sunil Kumar, Suhail Qadir Ganie","doi":"10.1109/ISCON57294.2023.10112159","DOIUrl":null,"url":null,"abstract":"Music can be a powerful tool to describe the human mood. Hand Gestures and Facial emotions are forms of fast non-linguistic communication. The current research on Music recommendation either using a hand gesture music controller (that only controls the operations for playing music) or an emotion based music player but not both. In this work, a new and hybrid approach for playing music both using hand gestures and facial emotions is proposed that can help the user to recommend and play music. In this research facial expression recognizer(FER) algorithm is used that extract the features from the image for emotion detection and the MediaPipe framework and Tensorflow library are used for hand detection and gesture recognition respectively. The music will play based on the most recent gesture and emotion by using a pygame. First, priority is given to hand gestures and then to facial emotions. The accuracy of the proposed work is also compared with existing approaches to music recommendation.","PeriodicalId":280183,"journal":{"name":"2023 6th International Conference on Information Systems and Computer Networks (ISCON)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 6th International Conference on Information Systems and Computer Networks (ISCON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISCON57294.2023.10112159","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citation count: 1
Abstract
Music can be a powerful tool for describing the human mood. Hand gestures and facial emotions are forms of fast, non-linguistic communication. Current research on music recommendation uses either a hand-gesture music controller (which only controls playback operations) or an emotion-based music player, but not both. In this work, a new hybrid approach is proposed that uses both hand gestures and facial emotions to recommend and play music for the user. A facial expression recognizer (FER) algorithm extracts features from the image for emotion detection, while the MediaPipe framework and the TensorFlow library are used for hand detection and gesture recognition, respectively. Music is played with Pygame based on the most recent gesture and emotion: priority is given first to hand gestures and then to facial emotions. The accuracy of the proposed work is also compared with existing approaches to music recommendation.
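To make the described pipeline concrete, below is a minimal sketch (not the authors' code) of how the components named in the abstract could fit together: the FER library for facial emotion detection, MediaPipe hand landmarks fed into an assumed TensorFlow/Keras gesture classifier, and Pygame for playback, with hand gestures taking priority over facial emotions. The model file "gesture_classifier.h5", the gesture label set, and the emotion-to-track mapping are all hypothetical placeholders, since the abstract does not specify them.

import cv2
import numpy as np
import pygame
import mediapipe as mp
from fer import FER                              # facial expression recognizer
from tensorflow.keras.models import load_model   # assumed gesture classifier

# Hypothetical assets: a trained gesture classifier and emotion-indexed tracks.
gesture_model = load_model("gesture_classifier.h5")
GESTURES = ["play", "pause", "next", "previous"]
PLAYLISTS = {"happy": "happy_song.mp3", "sad": "sad_song.mp3", "neutral": "calm_song.mp3"}

emotion_detector = FER()
hands = mp.solutions.hands.Hands(max_num_hands=1)
pygame.mixer.init()

def recommend_and_play(frame_bgr):
    """Act on the most recent hand gesture; fall back to facial emotion."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)

    # 1) Hand gestures have first priority.
    result = hands.process(rgb)
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        keypoints = np.array([[p.x, p.y, p.z] for p in lm]).flatten()[None, :]
        gesture = GESTURES[int(np.argmax(gesture_model.predict(keypoints, verbose=0)))]
        if gesture == "pause":
            pygame.mixer.music.pause()
        elif gesture == "play":
            pygame.mixer.music.unpause()
        return gesture

    # 2) Otherwise, recommend and play a track from the detected facial emotion.
    emotion, _score = emotion_detector.top_emotion(frame_bgr)
    track = PLAYLISTS.get(emotion, PLAYLISTS["neutral"])
    pygame.mixer.music.load(track)
    pygame.mixer.music.play()
    return emotion

In this sketch the gesture branch returns without touching the recommendation, mirroring the abstract's statement that gestures are checked before emotions; the actual feature extraction and classifier used in the paper may differ.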