{"title":"基于模式识别和强化学习的大学生心理健康分析","authors":"Pengrui Zhi","doi":"10.1002/itl2.453","DOIUrl":null,"url":null,"abstract":"<p>Mental health education for college students is an important part of ideological and political work in colleges and universities, which is related to the physical and mental health and long-term development of college students. In particular, the nature of the work of counselors endows them with unique advantages in psychological education. Based on the insufficiency of unimodal data features, we propose a method for analyzing the mental health of college students based on multimodal social-affective classification. At the same time, we design a multimodal data fusion model, which takes text data as the main body and uses text and images to jointly classify the main body's emotion. First, we use the Bidirectional Encoder Representations from Transformers (BERT) pre-training model to extract text features and obtain corresponding text vectors. Second, we utilize the Visual Geometry Group (VGG16) model trained on the ImageNet dataset as a pre-training model to obtain image features. Third, we combine the modality features extracted by the two models to complete the final mental health classification task. Experimental results show that our proposed multimodal feature fusion model exhibits good performance on both constructed and public datasets.</p>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"7 6","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2023-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Mental health analysis for college students based on pattern recognition and reinforcement learning\",\"authors\":\"Pengrui Zhi\",\"doi\":\"10.1002/itl2.453\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Mental health education for college students is an important part of ideological and political work in colleges and universities, which is related to the physical and mental health and long-term development of college students. In particular, the nature of the work of counselors endows them with unique advantages in psychological education. Based on the insufficiency of unimodal data features, we propose a method for analyzing the mental health of college students based on multimodal social-affective classification. At the same time, we design a multimodal data fusion model, which takes text data as the main body and uses text and images to jointly classify the main body's emotion. First, we use the Bidirectional Encoder Representations from Transformers (BERT) pre-training model to extract text features and obtain corresponding text vectors. Second, we utilize the Visual Geometry Group (VGG16) model trained on the ImageNet dataset as a pre-training model to obtain image features. Third, we combine the modality features extracted by the two models to complete the final mental health classification task. 
Experimental results show that our proposed multimodal feature fusion model exhibits good performance on both constructed and public datasets.</p>\",\"PeriodicalId\":100725,\"journal\":{\"name\":\"Internet Technology Letters\",\"volume\":\"7 6\",\"pages\":\"\"},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2023-06-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Internet Technology Letters\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/itl2.453\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"TELECOMMUNICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Internet Technology Letters","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/itl2.453","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"TELECOMMUNICATIONS","Score":null,"Total":0}
Mental health analysis for college students based on pattern recognition and reinforcement learning
Mental health education for college students is an important part of ideological and political work in colleges and universities, and it bears directly on students' physical and mental health and long-term development. Counselors, by the nature of their work, hold unique advantages in psychological education. Because unimodal data features are insufficient for this task, we propose a method for analyzing college students' mental health based on multimodal social-affective classification. We also design a multimodal data fusion model that treats text as the primary modality and uses text and images jointly to classify the subject's emotion. First, we use the Bidirectional Encoder Representations from Transformers (BERT) pre-trained model to extract text features and obtain the corresponding text vectors. Second, we use a Visual Geometry Group (VGG16) network pre-trained on the ImageNet dataset to extract image features. Third, we fuse the modality features extracted by the two models to perform the final mental health classification. Experimental results show that the proposed multimodal feature fusion model performs well on both our constructed dataset and public datasets.
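As a rough illustration of the fusion pipeline described in the abstract, the sketch below pairs a pre-trained BERT text encoder with an ImageNet-pre-trained VGG16 image encoder and concatenates the two feature vectors before a small classification head. The abstract does not specify the fusion head, hidden sizes, or number of mental-health classes, so those choices (and the `MultimodalFusionClassifier` name) are illustrative assumptions, written against the Hugging Face `transformers` and `torchvision` APIs rather than the authors' code.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer
from torchvision.models import vgg16


class MultimodalFusionClassifier(nn.Module):
    """Minimal text+image fusion sketch: BERT pooled text features are
    concatenated with VGG16 image features and passed to a linear head.
    Layer sizes and class count are assumptions, not the paper's values."""

    def __init__(self, num_classes=3):
        super().__init__()
        # Text branch: pre-trained BERT; the pooled [CLS] output is 768-d.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Image branch: VGG16 pre-trained on ImageNet; drop its final
        # 1000-way ImageNet classifier to expose a 4096-d feature vector.
        backbone = vgg16(weights="IMAGENET1K_V1")
        backbone.classifier = nn.Sequential(*list(backbone.classifier.children())[:-1])
        self.vgg = backbone
        # Fusion head over the concatenated modality features (768 + 4096).
        self.head = nn.Sequential(
            nn.Linear(768 + 4096, 512),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(512, num_classes),
        )

    def forward(self, input_ids, attention_mask, images):
        text_feat = self.bert(input_ids=input_ids,
                              attention_mask=attention_mask).pooler_output  # (B, 768)
        image_feat = self.vgg(images)                                       # (B, 4096)
        fused = torch.cat([text_feat, image_feat], dim=1)                   # (B, 4864)
        return self.head(fused)                                             # class logits


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = MultimodalFusionClassifier(num_classes=3)
    batch = tokenizer(["I feel anxious before every exam."], padding=True,
                      truncation=True, max_length=128, return_tensors="pt")
    images = torch.randn(1, 3, 224, 224)  # placeholder for a preprocessed photo
    logits = model(batch["input_ids"], batch["attention_mask"], images)
    print(logits.shape)  # torch.Size([1, 3])
```

In a setup like this, the two pre-trained branches are typically frozen or fine-tuned at a lower learning rate than the fusion head, reflecting the abstract's use of BERT and VGG16 purely as feature extractors whose outputs are combined for the final classification.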