Diego Miranda , Carlos Escobedo , Dayana Palma , Rene Noel , Adrián Fernández , Cristian Cechinel , Jaime Godoy , Roberto Munoz
Data in Brief, Volume 61, Article 111828 (published 2025-06-24). DOI: 10.1016/j.dib.2025.111828
A multimodal experimental dataset on agile software development team interactions
Studying collaborative dynamics in agile development teams requires multimodal data that captures verbal and non-verbal communication. However, few experimental datasets provide this level of depth in real or simulated teamwork contexts. This article presents a multimodal dataset with experimental data collected during controlled sessions involving simulated agile development teams, each composed of four computer science students. A total of 19 groups (76 different participants) were organized, each participating in two collaborative activities: one without a coordination technique and another using the Planning Poker method. Three of these teams were designated as control groups. The resulting dataset includes audio recordings of verbal interactions and non-verbal behaviour data, such as body posture, facial expressions, visual attention, and gestures, captured using MediaPipe, YOLOv8, and DeepSort. It also contains time-aligned automatic transcriptions generated with WhisperX, attention logs, mimicry labels, and surveys on perceived equity in interactions. This resource aims to provide a comprehensive view of collaborative behaviour in agile contexts, supporting both qualitative analysis of interactions and the development of predictive models of group performance. Through this multimodal approach, the dataset supports exploring how shared visual attention and behavioural synchrony influence team effectiveness and decision-making. This work contributes a unique dataset valuable to researchers across multiple fields of study.
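As an illustration of how such time-aligned streams might be combined by a reader of the dataset, the sketch below joins transcript segments with attention-log samples on their timestamps using pandas. The column names, identifiers, and values here are assumptions for illustration only, not the dataset's actual schema.

```python
import pandas as pd

# Hypothetical transcript segments (start time in seconds) -- illustrative schema only.
transcript = pd.DataFrame({
    "t": [0.0, 2.5, 6.1],
    "speaker": ["P1", "P2", "P1"],
    "text": ["I estimate three points.", "I'd say five.", "Let's vote."],
})

# Hypothetical attention-log samples: where a participant's gaze is directed.
attention = pd.DataFrame({
    "t": [0.2, 2.4, 6.0],
    "gaze_target": ["P2", "P1", "screen"],
})

# Align each utterance with the nearest attention sample in time.
# merge_asof requires both frames to be sorted on the key column.
aligned = pd.merge_asof(transcript, attention, on="t", direction="nearest")
print(aligned[["t", "speaker", "gaze_target"]])
```

This kind of nearest-timestamp join is one simple way to relate WhisperX utterance boundaries to the behavioural logs; analyses of mimicry or shared attention would likely use windowed aggregation instead.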
Journal description:
Data in Brief provides a way for researchers to easily share and reuse each other's datasets by publishing data articles that:
- Thoroughly describe your data, facilitating reproducibility.
- Make your data, which is often buried in supplementary material, easier to find.
- Increase traffic towards associated research articles and data, leading to more citations.
- Open up doors for new collaborations.
Because you never know what data will be useful to someone else, Data in Brief welcomes submissions that describe data from all research areas.