{"title":"Human-Robot Cooperative Piano Playing with Learning-Based Real-Time Music Accompaniment","authors":"Huijiang Wang, Xiaoping Zhang, Fumiya Iida","doi":"arxiv-2409.11952","DOIUrl":null,"url":null,"abstract":"Recent advances in machine learning have paved the way for the development of\nmusical and entertainment robots. However, human-robot cooperative instrument\nplaying remains a challenge, particularly due to the intricate motor\ncoordination and temporal synchronization. In this paper, we propose a\ntheoretical framework for human-robot cooperative piano playing based on\nnon-verbal cues. First, we present a music improvisation model that employs a\nrecurrent neural network (RNN) to predict appropriate chord progressions based\non the human's melodic input. Second, we propose a behavior-adaptive controller\nto facilitate seamless temporal synchronization, allowing the cobot to generate\nharmonious acoustics. The collaboration takes into account the bidirectional\ninformation flow between the human and robot. We have developed an\nentropy-based system to assess the quality of cooperation by analyzing the\nimpact of different communication modalities during human-robot collaboration.\nExperiments demonstrate that our RNN-based improvisation can achieve a 93\\%\naccuracy rate. Meanwhile, with the MPC adaptive controller, the robot could\nrespond to the human teammate in homophony performances with real-time\naccompaniment. Our designed framework has been validated to be effective in\nallowing humans and robots to work collaboratively in the artistic\npiano-playing task.","PeriodicalId":501031,"journal":{"name":"arXiv - CS - Robotics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Robotics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11952","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Recent advances in machine learning have paved the way for the development of musical and entertainment robots. However, human-robot cooperative instrument playing remains challenging, particularly because of the intricate motor coordination and temporal synchronization it demands. In this paper, we propose a theoretical framework for human-robot cooperative piano playing based on non-verbal cues. First, we present a music improvisation model that employs a recurrent neural network (RNN) to predict appropriate chord progressions from the human player's melodic input.
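As an illustration of this component, the sketch below shows one way such a melody-to-chord predictor could look in PyTorch. The pitch-class melody encoding, the 24-chord vocabulary, the LSTM architecture, and all dimensions are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch of an RNN chord predictor (assumed design, not the paper's).
import torch
import torch.nn as nn

NUM_PITCH_CLASSES = 12  # assumed melody encoding: one token per semitone
NUM_CHORDS = 24         # assumed chord vocabulary: 12 major + 12 minor triads

class ChordPredictor(nn.Module):
    def __init__(self, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(NUM_PITCH_CLASSES, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, NUM_CHORDS)

    def forward(self, melody_tokens):
        # melody_tokens: (batch, time) integer pitch classes
        x = self.embed(melody_tokens)
        out, _ = self.rnn(x)
        # Use the last hidden state to score the chord for the next beat.
        return self.head(out[:, -1, :])

# Usage: predict a chord for a short melodic fragment.
model = ChordPredictor()
melody = torch.tensor([[0, 4, 7, 4]])          # e.g. C-E-G-E as pitch classes
chord_logits = model(melody)
predicted_chord = chord_logits.argmax(dim=-1)  # index into the chord vocabulary
```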
Second, we propose a behavior-adaptive controller, based on model predictive control (MPC), that facilitates seamless temporal synchronization and allows the cobot to produce harmonious accompaniment.
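The abstract does not detail the controller, but a receding-horizon tempo tracker gives the flavor of an MPC-style synchronization loop. The integrator dynamics, horizon length, and cost weights below are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch of a receding-horizon (MPC-style) tempo controller: the robot
# adjusts its inter-beat interval (IBI) to track the human's predicted tempo.
import numpy as np
from scipy.optimize import minimize

HORIZON = 5           # prediction horizon in beats (assumed)
CONTROL_WEIGHT = 0.1  # penalty on abrupt tempo changes (assumed)

def mpc_tempo_step(robot_ibi, human_ibi_forecast):
    """Return the tempo adjustment (seconds of IBI) to apply this beat."""
    def cost(u):
        ibi = robot_ibi
        total = 0.0
        for k in range(HORIZON):
            ibi = ibi + u[k]                     # simple integrator dynamics
            total += (ibi - human_ibi_forecast[k]) ** 2  # tracking error
            total += CONTROL_WEIGHT * u[k] ** 2          # smoothness penalty
        return total

    result = minimize(cost, np.zeros(HORIZON), method="L-BFGS-B")
    return result.x[0]  # receding horizon: apply only the first adjustment

# Usage: the human is slowing from 0.50 s to 0.54 s per beat.
forecast = np.linspace(0.50, 0.54, HORIZON)
adjustment = mpc_tempo_step(robot_ibi=0.48, human_ibi_forecast=forecast)
```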
The collaboration accounts for the bidirectional information flow between the human and the robot. We have also developed an entropy-based system that assesses the quality of cooperation by analyzing the impact of different communication modalities during human-robot collaboration.
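As a hedged sketch of what an entropy-based cooperation measure could look like: below, Shannon entropy is computed over binned human-robot onset asynchronies, with lower entropy read as tighter coordination. The use of onset asynchronies, the binning, and the synthetic data are all assumptions; the paper's exact metric may differ.

```python
# Minimal sketch of an entropy-based cooperation score (assumed formulation).
import numpy as np

def cooperation_entropy(asynchronies, num_bins=20, range_s=(-0.2, 0.2)):
    """Shannon entropy (bits) of the onset-asynchrony distribution.

    Lower entropy indicates more consistent temporal coordination.
    """
    counts, _ = np.histogram(asynchronies, bins=num_bins, range=range_s)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# Usage: compare two communication conditions (synthetic data for illustration).
rng = np.random.default_rng(0)
with_cues = rng.normal(0.0, 0.01, size=200)     # tight synchronization
without_cues = rng.normal(0.0, 0.05, size=200)  # looser synchronization
print(cooperation_entropy(with_cues), cooperation_entropy(without_cues))
```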
Experiments demonstrate that our RNN-based improvisation achieves a 93% accuracy rate. Meanwhile, with the MPC-based adaptive controller, the robot responds to its human teammate with real-time accompaniment in homophonic performances. The proposed framework is validated as effective in enabling humans and robots to collaborate on the artistic task of piano playing.