Development of Multi-task Models for Emotion-Aware Gender Prediction

Chanchal Suman, Abhishek Singh, S. Saha, P. Bhattacharyya

2022 International Joint Conference on Neural Networks (IJCNN), 18 July 2022. DOI: 10.1109/IJCNN55064.2022.9892404
The rise of personalized online services has created a major opportunity for user profiling. Gender plays an important role for services that rely on information about a user's background; however, for reasons of anonymity and privacy, a user's gender is usually not visible to other users. Social networking sites give users many ways to express their thoughts and emotions, whether through pictures, emojis, or written text. Based on the observation that female and male users differ in the content of their posts and messages, a social media account can be analyzed through its posts to infer the user's gender. In this work, we explore different emotion-aided multimodal gender prediction models. The basic intuition behind our approach is to predict a user's gender from the emotional cues present in their multimodal posts, which include both text and images. For experimentation, the PAN 2018 dataset is enriched with emotion labels, and several multi-task architectures are developed for gender prediction. Results on the benchmark PAN 2018 dataset show that the proposed multimodal emotion-aided system outperforms both single-modality models (text-only and image-only) and the state-of-the-art system.
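The abstract does not spell out the architectures, but a typical emotion-aided multi-task setup pairs a shared multimodal encoder with separate gender and emotion heads, so the auxiliary emotion task shapes the representation used for gender prediction. The PyTorch sketch below illustrates that idea under stated assumptions only: the module names, feature dimensions, number of emotion classes, and the loss weight aux_weight are hypothetical and not taken from the paper.

```python
# Hypothetical multi-task sketch: a shared multimodal encoder feeding
# separate gender and emotion heads. Dimensions, fusion strategy, and
# loss weighting are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn

class EmotionAidedGenderModel(nn.Module):
    def __init__(self, text_dim=768, image_dim=2048, hidden=256, n_emotions=6):
        super().__init__()
        # Project pre-extracted text and image features into a shared space.
        self.text_proj = nn.Linear(text_dim, hidden)
        self.image_proj = nn.Linear(image_dim, hidden)
        self.fusion = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Dropout(0.3)
        )
        # Task-specific heads: binary gender plus auxiliary emotion labels.
        self.gender_head = nn.Linear(hidden, 2)
        self.emotion_head = nn.Linear(hidden, n_emotions)

    def forward(self, text_feats, image_feats):
        shared = self.fusion(torch.cat(
            [self.text_proj(text_feats), self.image_proj(image_feats)], dim=-1))
        return self.gender_head(shared), self.emotion_head(shared)

def multitask_loss(gender_logits, emotion_logits, gender_y, emotion_y,
                   aux_weight=0.5):
    # Joint objective: the emotion term regularizes the shared encoder.
    ce = nn.functional.cross_entropy
    return ce(gender_logits, gender_y) + aux_weight * ce(emotion_logits, emotion_y)
```

Because both heads backpropagate through the same fusion layer, gradients from the emotion task influence the features used for gender classification, which is the usual rationale for this kind of multi-task design.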