Modeling the Complexity of Music Metadata in Semantic Graphs for Exploration and Discovery
Pasquale Lisena, Raphael Troncy, Konstantin Todorov, Manel Achichi
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144754
Abstract: Representing and retrieving fine-grained information about something as complex as music composition, recording, and performance is a challenging activity. This complexity requires a data model able to describe the different outcomes of the creative process, from the writing of the score to its performance and publishing. In this paper, we show how we designed the DOREMUS ontology as an extension of the FRBRoo model in order to represent music metadata coming from different libraries and cultural institutions, and how we publish this data as RDF graphs. We designed and reused several controlled vocabularies that provide common identifiers, overcoming differences in language and alternative forms of the needed concepts. These graphs are interlinked with each other and with external resources on the Web of Data. We show how these graphs can be walked through in order to design a web-based application providing an exploratory search engine that presents complex music metadata to the end user. Finally, we demonstrate that this model and this exploratory application are suitable for answering non-trivial questions collected from experts, and constitute a first step towards a fully fledged recommendation engine.
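The graph walking that the abstract describes can be sketched with plain Python triples. The predicates and resources below are invented for illustration; they are not the actual DOREMUS/FRBRoo vocabulary.

```python
# Minimal sketch of walking an RDF-style graph of music metadata,
# from a work, through its score expression and a performance, to a
# recording. All names here are hypothetical placeholders.

triples = {
    ("ex:work/moonlight", "ex:composer", "ex:artist/beethoven"),
    ("ex:work/moonlight", "ex:hasExpression", "ex:expr/moonlight-score"),
    ("ex:expr/moonlight-score", "ex:performedIn", "ex:perf/1802-vienna"),
    ("ex:perf/1802-vienna", "ex:recordedAs", "ex:rec/r001"),
}

def objects(graph, subject, predicate):
    """Return all objects matching a (subject, predicate, ?) pattern."""
    return [o for s, p, o in graph if s == subject and p == predicate]

def walk(graph, start, path):
    """Follow a chain of predicates from a start node, as an
    exploratory search UI might do to go from work to recording."""
    frontier = [start]
    for predicate in path:
        frontier = [o for node in frontier for o in objects(graph, node, predicate)]
    return frontier

recordings = walk(triples, "ex:work/moonlight",
                  ["ex:hasExpression", "ex:performedIn", "ex:recordedAs"])
print(recordings)  # ["ex:rec/r001"]
```

In a real deployment this traversal would be a SPARQL property path over the published RDF graphs rather than an in-memory set of tuples.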
Non-chord Tone Identification Using Deep Neural Networks
Yaolong Ju, Nathaniel Condit-Schultz, Claire Arthur, Ichiro Fujinaga
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144753
Abstract: This paper addresses the problem of harmonic analysis by proposing a non-chord tone identification model using a deep neural network (DNN). By identifying non-chord tones, the task of harmonic analysis is much simplified. Trained and tested on a dataset of 140 Bach chorales, an initial DNN was able to identify non-chord tones with an F1-measure of 57.00% using pitch-class information alone. By adding metric information and a small contextual window, and by fine-tuning the DNN, the model's performance improved to an F1-measure of 72.19%. These results suggest that DNNs offer an innovative and promising approach to non-chord tone identification, as well as to harmonic analysis.
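The trained model itself is not reproduced here, but the input representation the abstract describes (pitch-class information plus metric information and a small contextual window) can be sketched as a toy forward pass. Every size and weight below is illustrative and random, not trained on the chorale dataset.

```python
import math
import random

random.seed(0)

# Hypothetical feature layout inspired by the paper's description:
# 12-dim pitch-class vectors for the previous, current, and next
# notes (a small contextual window), plus one metric-strength scalar.
N_IN, N_HID = 12 * 3 + 1, 16

# Untrained, randomly initialized weights for a one-hidden-layer net.
W1 = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
b1 = [0.0] * N_HID
W2 = [random.uniform(-0.5, 0.5) for _ in range(N_HID)]
b2 = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def one_hot(pc):
    v = [0.0] * 12
    v[pc] = 1.0
    return v

def predict_nct(pc_prev, pc_cur, pc_next, metric_strength):
    """Return a pseudo-probability that the current note is a
    non-chord tone, given its pitch-class context."""
    x = pc_prev + pc_cur + pc_next + [metric_strength]
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)

# C -> D -> E on a weak beat: output is a value strictly in (0, 1).
p = predict_nct(one_hot(0), one_hot(2), one_hot(4), 0.25)
```

The actual model would learn W1 and W2 from the annotated chorales; this sketch only shows the shape of the input and the forward computation.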
MIRchiving: Challenges and opportunities of connecting MIR research and digital music archives
R. Valk, A. Volk, A. Holzapfel, A. Pikrakis, N. Kroher, Joren Six
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144755
Abstract: This study is a call to action for the music information retrieval (MIR) community to pay more attention to collaboration with digital music archives. The study, which resulted from an interdisciplinary workshop and subsequent discussion, matches the demand for MIR technologies from various archives with what the MIR community already supplies. We conclude that the expressed demands can only be served sustainably through closer collaborations. Whereas MIR systems are described in scientific publications, usable implementations are often absent, and when a runnable system does exist, its user documentation is often sparse, posing a huge hurdle for archivists who wish to employ it. This study sheds light on the current limitations and opportunities of MIR research in the context of music archives by means of examples, and highlights available tools. As a basic guideline for collaboration, we propose interpreting MIR research as part of a value chain. We identify the following benefits of collaboration between MIR researchers and music archives: new perspectives for content access in archives, more diverse evaluation data and methods, and a more application-oriented MIR research workflow.
Tools for Music Bibliographic Network Analysis
Andrew Horwitz, Yun Fan, Richard Brown
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144761
Abstract: Many metadata repositories have been used to enrich music information retrieval applications, focusing on topics such as recorded music or performance catalogues. We present a database of bibliographic information that focuses on indexing writings on music, containing expert-curated subject terms from a controlled vocabulary, embedded inter-lingual equivalencies of these terms, and links that connect works and terms with canonical reference identifiers. An ontology of these terms, as well as an API to access them programmatically, is under development for use in exploring and analyzing musicological bibliographic networks across domains, languages, and cultures.
Applications of duplicate detection: linking meta-data and merging music archives: The experience of the IPEM historical archive of electronic music
F. Bressan, Joren Six, M. Leman
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144759
Abstract: This work focuses on applications of duplicate detection for managing digital music archives. It aims to make this mature music information retrieval (MIR) technology better known to archivists and to provide clear suggestions on how it can be used in practice. More specifically, applications are discussed for complementing metadata, linking or merging digital music archives, improving listening experiences, and re-using segmentation data. The IPEM archive, a digitized music archive containing early electronic music, provides a case study.
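The core mechanism behind audio duplicate detection can be sketched as fingerprint matching: quantize each item into a coarse feature sequence, index overlapping n-grams, and count shared n-grams between items. The integer features and thresholds below are toy stand-ins, not the actual fingerprinting used for the IPEM archive.

```python
from collections import defaultdict

def fingerprints(features, n=4):
    """Overlapping n-grams of a quantized feature sequence."""
    return {tuple(features[i:i + n]) for i in range(len(features) - n + 1)}

def build_index(archive, n=4):
    """Inverted index: fingerprint -> set of items containing it."""
    index = defaultdict(set)
    for item_id, features in archive.items():
        for fp in fingerprints(features, n):
            index[fp].add(item_id)
    return index

def find_duplicates(query, index, n=4, threshold=3):
    """Items sharing at least `threshold` fingerprints with the query."""
    hits = defaultdict(int)
    for fp in fingerprints(query, n):
        for item_id in index[fp]:
            hits[item_id] += 1
    return [i for i, c in hits.items() if c >= threshold]

archive = {
    "tape_A": [1, 2, 3, 4, 5, 6, 7, 8],
    "tape_B": [9, 9, 1, 2, 3, 4, 5, 6],   # shares a long run with tape_A
    "tape_C": [7, 7, 7, 1, 9, 2, 8, 3],   # no shared run
}
index = build_index(archive)
dupes = find_duplicates([1, 2, 3, 4, 5, 6], index)
```

Once duplicates are found, metadata from one copy can be propagated to the other, which is the linking-and-merging use case the paper discusses.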
Quantitative analysis of the relationship between linguistic tones and melody in jingju using music scores
Rafael Caro Repetto, Shuo Zhang, Xavier Serra
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144758
Abstract: When lyrics in tonal languages are set to music, the pitch contour of the tones has to agree to a certain extent with the melodic contour to ensure intelligibility. The relationship between the linguistic tones of the complex dialectal construct used in jingju (commonly known as Beijing or Peking opera) and its melody has been studied at length, but no definite consensus has been achieved among scholars. After reviewing the related literature, we present a first approach to the quantitative analysis of the relationship between linguistic tones and melody in jingju, using a collection of machine-readable music scores with tone-category annotations for 7,283 syllables. We describe two statistical analyses performed on this collection, regarding the melodic contour of each syllable and the pitch-height relationship in 5,494 pairs of consecutive syllables. We argue that the obtained results support some claims from the literature and complement others, although some limitations of the approach may temper confidence in their validity.
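The first of the two analyses can be sketched as classifying each syllable's melodic contour and tabulating it against its annotated tone category. The tone labels and the tiny score excerpt below are invented for illustration and do not come from the paper's corpus.

```python
from collections import Counter

def contour(pitches):
    """Crude contour class from the first and last MIDI pitch of a
    syllable: ascending, descending, or level."""
    if pitches[-1] > pitches[0]:
        return "ascending"
    if pitches[-1] < pitches[0]:
        return "descending"
    return "level"

# Toy annotated syllables: tone category plus the MIDI pitches sung
# on that syllable (hypothetical data).
syllables = [
    {"tone": "yin_ping",  "pitches": [67, 67, 67]},
    {"tone": "yang_ping", "pitches": [62, 64, 67]},
    {"tone": "qu",        "pitches": [69, 67, 64]},
    {"tone": "yang_ping", "pitches": [60, 62]},
]

# Contingency counts of (tone category, contour class).
table = Counter((s["tone"], contour(s["pitches"])) for s in syllables)
print(table[("yang_ping", "ascending")])  # 2
```

Over thousands of annotated syllables, such a contingency table is what lets one test statistically whether tone categories prefer particular melodic contours.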
The Comprehensive Application of Vmus.net for Musical Performance Studies
Jian Yang
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144750
Abstract: There are two main difficulties in applying empirical approaches such as MIR (music information retrieval) methods in music research: the complexity of the related software, and the integration of empirical findings with their musicological context. To address these problems, an easy-to-use online application, Vmus.net, is introduced through three typical examples combining visualization methods such as IOI-deviation and dynamic curves, tempo-dynamic curves, and the performance worm with sensible musical and historical considerations. Compared with similar desktop software, Vmus.net is a relatively compact tool, still under construction, with a growing number of users and datasets. It is presumed that simplifying the analytical process and providing printer-friendly charts for direct quotation may help persuade more researchers to embed the relevant technology deeply in their musicological investigations.
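The "IOI deviation" visualization mentioned above rests on a simple computation: compare the measured inter-onset intervals (IOIs) of a performance against the durations the score implies at a steady tempo. This is a minimal sketch of that idea with invented numbers, not Vmus.net's actual implementation.

```python
def ioi_deviations(onsets_sec, score_durations_beats, tempo_bpm):
    """Deviation of each performed IOI from its nominal score duration.
    Positive = the performer lingered; negative = the performer rushed."""
    beat = 60.0 / tempo_bpm
    nominal = [d * beat for d in score_durations_beats]
    measured = [b - a for a, b in zip(onsets_sec, onsets_sec[1:])]
    return [m - n for m, n in zip(measured, nominal)]

# Four note onsets of a hypothetical performance of three quarter
# notes at 120 BPM (one beat = 0.5 s).
devs = ioi_deviations(
    onsets_sec=[0.0, 0.52, 1.08, 1.5],
    score_durations_beats=[1, 1, 1],
    tempo_bpm=120,
)
print([round(d, 2) for d in devs])  # [0.02, 0.06, -0.08]
```

Plotting these deviations against score position gives the kind of deviation curve the tool renders for performance comparison.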
PatternFinder: Content-Based Music Retrieval with music21
David Garfinkle, Claire Arthur, Peter Schubert, Julie Cumming, Ichiro Fujinaga
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144751
Abstract: Content-Based Music Retrieval (CBMR) for symbolic music aims to find all similar occurrences of a musical pattern within a larger database of symbolic music. To the best of our knowledge, there does not currently exist a distributable CBMR software package integrated with a music analysis toolkit that facilitates extension with new CBMR methods. This project presents a new MIR tool, PatternFinder, that satisfies these goals. PatternFinder is built on the computational musicology Python package music21, which provides a flexible platform capable of working with many music notation formats. To achieve polyphonic CBMR, we implement seven geometric algorithms developed at the University of Helsinki, four of which are implemented and released publicly for the first time. The application of our MIR tool is then demonstrated through a musicological investigation of Renaissance imitation masses, which borrow melodic or contrapuntal material from a pre-existing musical work. In addition, we show PatternFinder's ability to find a contrapuntal pattern over a large dataset, Palestrina's 104 masses. Our investigations demonstrate the relevance of our tool for musicological research, as well as its potential application for locating music within digital music libraries.
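The simplest member of the Helsinki family of geometric algorithms, exact matching under translation (commonly known as P1), conveys the flavor of the approach: notes become (onset, pitch) points, and a pattern matches wherever a single translation vector maps every pattern point into the source. This sketch is a from-scratch illustration, not PatternFinder's code.

```python
def exact_matches(pattern, source):
    """Return every (time shift, transposition) vector that maps all
    pattern points onto points of the source (P1-style matching)."""
    source_set = set(source)
    p0 = pattern[0]
    matches = []
    for s in source:
        # Candidate translation aligning the first pattern note with s.
        shift = (s[0] - p0[0], s[1] - p0[1])
        if all((p[0] + shift[0], p[1] + shift[1]) in source_set
               for p in pattern):
            matches.append(shift)
    return matches

# A three-note ascending motif, and a source containing it two beats
# later and transposed up four semitones (plus an unrelated note).
pattern = [(0, 60), (1, 62), (2, 64)]
source = [(2, 64), (3, 66), (4, 68), (5, 57)]
print(exact_matches(pattern, source))  # [(2, 4)]
```

Because matching is purely geometric, the same code finds a melodic pattern inside polyphony, which is what makes this family of algorithms attractive for imitation-mass studies.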
GRAIL: Database Linking Music Metadata Across Artist, Release, and Track
Michael D. Barone, Kurt Dacosta, Gabriel Vigliensoni, M. Woolhouse
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144760
Abstract: Linking information from multiple music databases is important for MIR because it provides a means of checking the consistency of metadata between resources and services, which can help facilitate innovative product development and research. However, as yet, no open-access tools exist that persistently link and validate metadata resources at the three main entities of music data: artist, release, and track. This paper introduces an open-access resource that attempts to address the issue of linking information from multiple music databases. The General Recorded Audio Identity Linker (GRAIL, api.digitalmusiclab.org) is a music metadata ID-linking API that: i) connects International Standard Recording Codes (ISRCs) to music metadata IDs from services such as MusicBrainz, Spotify, and Last.FM; ii) provides these ID linkages as a publicly available resource; iii) confirms linkage accuracy through continuous metadata crawling of music-service APIs; and iv) derives consistency values (CVs) for linkages by means of a set of quantifiable criteria. To date, more than 35M tracks, 8M releases, and 900K artists from 16 services have been ingested into GRAIL. We discuss the challenges faced in past attempts to link music metadata, and the methods and rationale we adopted in order to construct GRAIL and to ensure that it remains updated with validated information.
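The shape of an ISRC-keyed linkage with a consistency value can be sketched as follows. The services, IDs, and the CV formula (fraction of services agreeing with the majority on a metadata field) are illustrative stand-ins, not GRAIL's actual criteria.

```python
from collections import Counter

# Hypothetical linkage record: one ISRC mapped to per-service track
# IDs and the title each service reports for it.
linkages = {
    "USUM71703861": {
        "musicbrainz": {"id": "mb:1234", "title": "Song A"},
        "spotify":     {"id": "sp:abcd", "title": "Song A"},
        "lastfm":      {"id": "lf:wxyz", "title": "Song A (Remastered)"},
    },
}

def consistency_value(isrc, field="title"):
    """Toy CV: fraction of services whose value for `field` agrees
    with the majority value across all linked services."""
    values = [svc[field] for svc in linkages[isrc].values()]
    majority_count = Counter(values).most_common(1)[0][1]
    return majority_count / len(values)

cv = consistency_value("USUM71703861")
print(round(cv, 2))  # 0.67
```

Recomputing such a value after each metadata crawl is one simple way a linker can flag linkages that have drifted out of agreement between services.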
The Perception of Emotion in the Singing Voice: The Understanding of Music Mood for Music Organisation
Emilia Parada-Cabaleiro, Alice Baird, A. Batliner, N. Cummins, Simone Hantke, Björn Schuller
Proceedings of the 4th International Workshop on Digital Libraries for Musicology, 28 October 2017. DOI: https://doi.org/10.1145/3144749.3144756
Abstract: With the increased usage of internet-based services and the mass of digital content now available online, the organisation of such content has become a major topic of interest both commercially and within academic research. Adding emotional understanding of content is a relevant parameter not only for music classification within digital libraries but also for improving user experience, via services including automated music recommendation. Despite the singing voice being well known for its natural communication of emotion, it is still unclear which specific musical characteristics of this signal are involved in such affective expressions. The present study investigates which musical parameters of singing relate to emotional content by evaluating the perception of emotion in electronically manipulated a cappella audio samples. A group of 24 individuals participated in a perception test evaluating the emotional dimensions of arousal and valence for 104 sung instances. Key results indicate that rhythmic-melodic contour is potentially related to the perception of arousal, whereas musical syntax and tempo can alter the perception of valence.