MultiCoCoA: Multimodal data collector from collocated collaborative activities
Diego Miranda, Jaime Godoy, Rene Noel, Cristian Cechinel, Roberto Munoz
SoftwareX, Volume 31, Article 102202. Published 2025-05-20. DOI: 10.1016/j.softx.2025.102202
https://www.sciencedirect.com/science/article/pii/S2352711025001694
Abstract
Collaborative work requires developing and applying soft skills that influence the formation of social, emotional, and professional competencies. Nevertheless, assessing the effectiveness of teamwork, collaboration, and communication is challenging and is commonly addressed through qualitative research approaches. Multimodal Learning Analytics (MMLA) techniques have successfully addressed the challenge of measuring different communication features, although only through isolated initiatives. This work presents MultiCoCoA, a multimodal analytics framework that facilitates data collection in collaborative activities. MultiCoCoA integrates state-of-the-art MMLA and machine learning techniques to analyze audio and video data, helping to identify areas in which communication skills can be improved. MultiCoCoA allows data to be uploaded and analyzed intuitively, presenting results through data visualization features and downloadable CSV files for use with external data analysis tools. To evaluate MultiCoCoA’s performance, we conducted both a technical validation and a user feedback analysis. In terms of accuracy, the system was tested on over 5700 manually labeled video frames from two collaborative software planning sessions, achieving 92.85% precision in detecting spoken interventions, 85.59% in direction-of-arrival estimation, and 74.88% in identifying observer–observed gaze pairs. To assess usability, we administered the System Usability Scale to five professionals in software development roles, obtaining a favorable usability perception that highlighted ease of use and functional integration, alongside contextual suggestions for deployment in dynamic work environments. The expected outcome of MultiCoCoA is to support research on communication and collaboration by providing quantitative insights that complement existing research methods.
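As a rough illustration of the frame-level validation described above, the following Python sketch compares per-frame system detections against manual labels and reports precision. The file names, the "speaking" column, and the CSV layout are assumptions made for this example and are not part of MultiCoCoA's documented interface.

```python
# Minimal sketch of a frame-level precision check, assuming (hypothetically)
# that both the manual labels and the system's detections are stored as CSVs
# with one row per video frame and a binary "speaking" column.

import csv


def load_labels(path: str, column: str = "speaking") -> list[int]:
    """Read a per-frame CSV file and return the binary label column."""
    with open(path, newline="") as f:
        return [int(row[column]) for row in csv.DictReader(f)]


def precision(predicted: list[int], manual: list[int]) -> float:
    """Precision = true positives / all positive predictions."""
    true_pos = sum(1 for p, m in zip(predicted, manual) if p == 1 and m == 1)
    all_pos = sum(predicted)
    return true_pos / all_pos if all_pos else 0.0


if __name__ == "__main__":
    # Hypothetical file names for one planning session.
    system = load_labels("session1_system_detections.csv")
    ground_truth = load_labels("session1_manual_labels.csv")
    print(f"Spoken-intervention precision: {precision(system, ground_truth):.2%}")
```

The same comparison could in principle be repeated for the direction-of-arrival and gaze-pair outputs, provided each is exported per frame alongside the corresponding manual annotations.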
Journal Introduction:
SoftwareX aims to acknowledge the impact of software on today's research practice, and on new scientific discoveries in almost all research domains. SoftwareX also aims to stress the importance of the software developers who are, in part, responsible for this impact. To this end, SoftwareX aims to support publication of research software in such a way that: The software is given a stamp of scientific relevance, and provided with a peer-reviewed recognition of scientific impact; The software developers are given the credits they deserve; The software is citable, allowing traditional metrics of scientific excellence to apply; The academic career paths of software developers are supported rather than hindered; The software is publicly available for inspection, validation, and re-use. Above all, SoftwareX aims to inform researchers about software applications, tools and libraries with a (proven) potential to impact the process of scientific discovery in various domains. The journal is multidisciplinary and accepts submissions from within and across subject domains such as those represented within the broad thematic areas below: Mathematical and Physical Sciences; Environmental Sciences; Medical and Biological Sciences; Humanities, Arts and Social Sciences. Originating from these broad thematic areas, the journal also welcomes submissions of software that works in cross-cutting thematic areas, such as citizen science, cybersecurity, digital economy, energy, global resource stewardship, health and wellbeing, etcetera. SoftwareX specifically aims to accept submissions representing domain-independent software that may impact more than one research domain.