{"title":"面向建筑教学传感器数据分析学习的块编程界面认知维度和注意维度检测","authors":"Mohammad Khalid , Abiola Akanmu , Ibukun Awolusi , Homero Murzi","doi":"10.1016/j.ijhcs.2025.103626","DOIUrl":null,"url":null,"abstract":"<div><div>The increasing adoption of sensing technologies in the construction industry generates vast amounts of raw data, requiring analytics skills for effective extraction, analysis, and communication of actionable insights. To address this, ActionSens, a block-based programming interface, was developed to equip undergraduate construction engineering students with domain-specific sensor data analytics skills. However, efficient user interaction with such tools requires integrating intelligent systems capable of detecting users’ attention and cognitive states to provide context-specific and tailored support. This study leveraged eye-tracking data from construction students during the usability evaluation of ActionSens to explore machine learning models for classifying areas of interest and interaction difficulties. For visual detection, key interface elements were defined as areas of interest, serving as ground truth, while interaction difficulty was labeled based on participant feedback for reported challenges. The Ensemble model demonstrated the highest performance, achieving 88.3% accuracy in classifying areas of interest with raw data, and 82.9% for classifying interaction difficulties using oversampling techniques. Results show that gaze position and pupil diameter were the most reliable predictors for classifying areas of interest and detecting interaction difficulties. This study pioneers the integration of machine learning and eye-tracking with block-based programming interfaces in construction education. It also reinforces the Aptitude-Treatment Interaction theory by demonstrating how personalized support can be adapted based on individual cognitive aptitudes to enhance learning outcomes. These findings further contribute to the development of adaptive learning environments that can detect specific user aptitudes and provide context-specific guidance, enabling students to acquire technical skills more effectively.</div></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":"205 ","pages":"Article 103626"},"PeriodicalIF":5.1000,"publicationDate":"2025-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Detection of cognitive and attention dimensions in block programming interface for learning sensor data analytics in construction education\",\"authors\":\"Mohammad Khalid , Abiola Akanmu , Ibukun Awolusi , Homero Murzi\",\"doi\":\"10.1016/j.ijhcs.2025.103626\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>The increasing adoption of sensing technologies in the construction industry generates vast amounts of raw data, requiring analytics skills for effective extraction, analysis, and communication of actionable insights. To address this, ActionSens, a block-based programming interface, was developed to equip undergraduate construction engineering students with domain-specific sensor data analytics skills. However, efficient user interaction with such tools requires integrating intelligent systems capable of detecting users’ attention and cognitive states to provide context-specific and tailored support. 
This study leveraged eye-tracking data from construction students during the usability evaluation of ActionSens to explore machine learning models for classifying areas of interest and interaction difficulties. For visual detection, key interface elements were defined as areas of interest, serving as ground truth, while interaction difficulty was labeled based on participant feedback for reported challenges. The Ensemble model demonstrated the highest performance, achieving 88.3% accuracy in classifying areas of interest with raw data, and 82.9% for classifying interaction difficulties using oversampling techniques. Results show that gaze position and pupil diameter were the most reliable predictors for classifying areas of interest and detecting interaction difficulties. This study pioneers the integration of machine learning and eye-tracking with block-based programming interfaces in construction education. It also reinforces the Aptitude-Treatment Interaction theory by demonstrating how personalized support can be adapted based on individual cognitive aptitudes to enhance learning outcomes. These findings further contribute to the development of adaptive learning environments that can detect specific user aptitudes and provide context-specific guidance, enabling students to acquire technical skills more effectively.</div></div>\",\"PeriodicalId\":54955,\"journal\":{\"name\":\"International Journal of Human-Computer Studies\",\"volume\":\"205 \",\"pages\":\"Article 103626\"},\"PeriodicalIF\":5.1000,\"publicationDate\":\"2025-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Human-Computer Studies\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1071581925001831\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, CYBERNETICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581925001831","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Detection of cognitive and attention dimensions in block programming interface for learning sensor data analytics in construction education
The increasing adoption of sensing technologies in the construction industry generates vast amounts of raw data, requiring analytics skills for effective extraction, analysis, and communication of actionable insights. To address this, ActionSens, a block-based programming interface, was developed to equip undergraduate construction engineering students with domain-specific sensor data analytics skills. However, efficient user interaction with such tools requires integrating intelligent systems capable of detecting users’ attention and cognitive states to provide context-specific and tailored support. This study leveraged eye-tracking data from construction students during the usability evaluation of ActionSens to explore machine learning models for classifying areas of interest and interaction difficulties. For visual detection, key interface elements were defined as areas of interest, serving as ground truth, while interaction difficulty was labeled based on participant feedback for reported challenges. The Ensemble model demonstrated the highest performance, achieving 88.3% accuracy in classifying areas of interest with raw data, and 82.9% for classifying interaction difficulties using oversampling techniques. Results show that gaze position and pupil diameter were the most reliable predictors for classifying areas of interest and detecting interaction difficulties. This study pioneers the integration of machine learning and eye-tracking with block-based programming interfaces in construction education. It also reinforces the Aptitude-Treatment Interaction theory by demonstrating how personalized support can be adapted based on individual cognitive aptitudes to enhance learning outcomes. These findings further contribute to the development of adaptive learning environments that can detect specific user aptitudes and provide context-specific guidance, enabling students to acquire technical skills more effectively.
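The abstract describes two classification tasks built on eye-tracking features (gaze position and pupil diameter): predicting which area of interest (AOI) a learner is attending to, and detecting interaction difficulty, with oversampling used to handle class imbalance. The sketch below is only an illustration of that general workflow, not the authors' pipeline: the feature set, AOI labels, the use of a random forest as the ensemble model, and SMOTE as the oversampling technique are all assumptions, and the data is synthetic.

```python
# Minimal sketch (assumed setup, synthetic data) of AOI and
# interaction-difficulty classification from eye-tracking features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from imblearn.over_sampling import SMOTE  # assumed oversampling method

rng = np.random.default_rng(0)

# Stand-in per-sample features: [gaze_x (px), gaze_y (px), pupil diameter (mm)]
X = np.column_stack([
    rng.uniform(0, 1920, 2000),
    rng.uniform(0, 1080, 2000),
    rng.normal(3.5, 0.6, 2000),
])

# Hypothetical AOI labels (e.g. block palette / canvas / output panel)
aoi = rng.integers(0, 3, 2000)
# Binary interaction-difficulty labels, deliberately imbalanced
difficulty = (rng.random(2000) < 0.15).astype(int)

# --- AOI classification on the raw features ---
X_tr, X_te, y_tr, y_te = train_test_split(X, aoi, test_size=0.3, random_state=0)
aoi_clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AOI accuracy:", accuracy_score(y_te, aoi_clf.predict(X_te)))

# --- Interaction-difficulty classification with oversampled training data ---
X_tr, X_te, y_tr, y_te = train_test_split(
    X, difficulty, test_size=0.3, stratify=difficulty, random_state=0)
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)  # balance classes
diff_clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_res, y_res)
print("Difficulty accuracy:", accuracy_score(y_te, diff_clf.predict(X_te)))
```

In practice the features would come from the eye tracker's fixation and pupil streams aligned to interface events, and the oversampling would be applied only to the training split, as above, so that reported accuracy reflects the original class distribution.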
Journal Introduction:
The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities.
Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...