Image Emotion Computing

Sicheng Zhao
{"title":"Image Emotion Computing","authors":"Sicheng Zhao","doi":"10.1145/2964284.2971473","DOIUrl":null,"url":null,"abstract":"Images can convey rich semantics and induce strong emotions in viewers. My research aims to predict image emotions from different aspects with respect to two main challenges: affective gap and subjective evaluation. To bridge the affective gap, we extract emotion features based on principles-of-art to recognize image-centric dominant emotions. As the emotions that are induced in viewers by an image are highly subjective and different, we propose to predict user-centric personalized emotion perceptions for each viewer and image-centric emotion probability distribution for each image. To tackle the subjective evaluation issue, we set up a large scale image emotion dataset from Flickr, named Image-Emotion-Social-Net, on both dimensional and categorical emotion representations with over 1 million images and about 8,000 users. Different types of factors may influence personalized image emotion perceptions, including visual content, social context, temporal evolution and location influence. We make an initial attempt to jointly combine them by the proposed rolling multi-task hypergraph learning. Both discrete and continuous emotion distributions are modelled via shared sparse learning. Further, several potential applications based on image emotions are designed and implemented.","PeriodicalId":140670,"journal":{"name":"Proceedings of the 24th ACM international conference on Multimedia","volume":"77 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 24th ACM international conference on Multimedia","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2964284.2971473","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Images can convey rich semantics and induce strong emotions in viewers. My research aims to predict image emotions from different aspects, addressing two main challenges: the affective gap and subjective evaluation. To bridge the affective gap, we extract emotion features based on principles of art to recognize image-centric dominant emotions. Because the emotions an image induces in viewers are highly subjective and vary from person to person, we propose to predict user-centric personalized emotion perceptions for each viewer and an image-centric emotion probability distribution for each image. To tackle the subjective evaluation issue, we build a large-scale image emotion dataset from Flickr, named Image-Emotion-Social-Net, covering both dimensional and categorical emotion representations with over 1 million images and about 8,000 users. Different types of factors may influence personalized image emotion perceptions, including visual content, social context, temporal evolution, and location influence. We make an initial attempt to combine them jointly through the proposed rolling multi-task hypergraph learning. Both discrete and continuous emotion distributions are modeled via shared sparse learning. Further, several potential applications based on image emotions are designed and implemented.
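To make the distribution-prediction idea concrete, the sketch below is a minimal illustration rather than the paper's implementation: it treats each of the eight commonly used Mikels emotion categories (assumed here, not stated in the abstract) as one regression task and uses scikit-learn's MultiTaskLasso, whose L2,1 penalty selects a single shared sparse set of visual features across all categories, as a stand-in for shared sparse learning. The feature matrix and target distributions are synthetic placeholders.

```python
# Minimal sketch (NOT the authors' implementation) of predicting an
# image-centric discrete emotion distribution via shared sparse learning.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

# Mikels' eight emotion categories, assumed as the label set for illustration.
EMOTIONS = ["amusement", "awe", "contentment", "excitement",
            "anger", "disgust", "fear", "sadness"]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))                        # placeholder visual features per image
Y = rng.dirichlet(np.ones(len(EMOTIONS)), size=500)   # placeholder per-image emotion distributions

# The L2,1 penalty forces all emotion categories to share one small set of
# selected features, i.e. the "shared sparse" structure across tasks.
model = MultiTaskLasso(alpha=0.01, max_iter=5000).fit(X, Y)

pred = model.predict(X[:1])                           # raw scores for one image
pred = np.clip(pred, 0, None)
pred = pred / max(pred.sum(), 1e-12)                  # renormalize into a probability distribution
print(dict(zip(EMOTIONS, pred.ravel().round(3))))
```

In a real pipeline, the target distributions would come from aggregating the per-viewer labels in Image-Emotion-Social-Net for each image; the clipping and renormalization step simply maps the raw regression outputs back onto a valid probability simplex.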