Urban perception by using eye movement data on street view images

IF 2.1 | Earth Sciences, Tier 3 | Q2 GEOGRAPHY
Nai Yang, Zhitao Deng, Fangtai Hu, Yi Chao, Lin Wan, Qingfeng Guan, Zhiwei Wei
DOI: 10.1111/tgis.13172
Journal: Transactions in GIS
Publication date: 2024-05-06 (Journal Article)
Citations: 0

Abstract

Understanding the spatial distribution patterns of urban perception and analyzing the correlation between human emotional perception and street composition elements are important for accurately understanding how people interact with the urban environment, urban planning, and urban management. Previous studies on urban perception using street view data have not fully considered the actual level of attention to different visual elements when browsing street view images. In this article, we use eye tracking technology to collect eye movement data and subjective perception evaluation data when people browse street view images, and analyze the correlation between the time to first fixation, duration of first fixation, and fixation frequency of different visual elements and the six perceptual outcomes of wealthy, safe, lively, beautiful, boring, and depressing. Furthermore, this article integrates eye movement data with street view semantic data and introduces a novel method for predicting urban perception using a machine learning algorithm. The proposed method outperforms a comparative model that solely relies on semantic data, exhibiting higher accuracy in perception prediction. Additionally, the study presents a perceptual mapping of the prediction results, providing a visual representation of the predicted urban perception outcomes. As vision is the primary perceptual channel, this study achieves a more objective and scientifically reliable urban perception, which is of reference value for the study of physical and mental health due to the urban physical environment.
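As a rough illustration of the eye-movement features named in the abstract (time to first fixation, duration of first fixation, and fixation frequency per visual element), the sketch below aggregates these metrics from a simple per-trial fixation log. The record layout, function name, and element labels are illustrative assumptions, not the paper's actual data format or pipeline.

```python
from collections import defaultdict

def fixation_metrics(fixations):
    """Aggregate per-element eye-movement metrics from one trial's fixation log.

    `fixations` is a time-ordered list of (onset_ms, duration_ms, element)
    tuples, where `element` is the semantic class of the street view region
    the fixation landed on (e.g. 'building', 'tree'). Returns, per element:
      - time_to_first_fixation: onset of the first fixation on that element
      - first_fixation_duration: duration of that first fixation
      - fixation_count: total fixations on that element (frequency proxy)
    """
    metrics = {}
    counts = defaultdict(int)
    for onset, duration, element in fixations:
        counts[element] += 1
        if element not in metrics:  # first fixation on this element
            metrics[element] = {
                "time_to_first_fixation": onset,
                "first_fixation_duration": duration,
            }
    for element, m in metrics.items():
        m["fixation_count"] = counts[element]
    return metrics

# Example trial: three fixations while viewing one street view image
log = [(120, 250, "building"), (400, 180, "tree"), (650, 300, "building")]
m = fixation_metrics(log)
```

Features aggregated this way could then be concatenated with per-image semantic proportions (e.g. pixel shares of buildings, trees, sky from segmentation) to form the combined input the abstract describes feeding to a machine learning predictor of the six perception scores.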
Source journal: Transactions in GIS
CiteScore: 4.60
Self-citation rate: 8.30%
Articles per year: 116
Journal description: Transactions in GIS is an international journal which provides a forum for high quality, original research articles, review articles, short notes and book reviews that focus on:
- practical and theoretical issues influencing the development of GIS
- the collection, analysis, modelling, interpretation and display of spatial data within GIS
- the connections between GIS and related technologies
- new GIS applications which help to solve problems affecting the natural or built environments, or business