Nataša Kovač, Kruna Ratković, Hojjatollah Farahani, Peter Watson
{"title":"内部羞耻感的机器学习回归模型。","authors":"Nataša Kovač, Kruna Ratković, Hojjatollah Farahani, Peter Watson","doi":"10.1016/j.actpsy.2025.105721","DOIUrl":null,"url":null,"abstract":"<p><p>This study aims to predict Internal Shame (IS) based on childhood trauma, social emotional competence, cognitive flexibility, distress tolerance and alexithymia in an Iranian sample. The regression results suggested that distress tolerance was the most significant predictor, whereas cognitive flexibility had the least impact. We initially tested nine machine learning regression techniques (Multi-Layer Perceptron, AdaBoost, Support Vector Regression, Artificial Neural Network, Decision Tree, Random Forest, Gradient Boosting, Stochastic Gradient Boosting, and Extreme Gradient Boosting). Based on performance evaluation, we retained five models (Decision Tree, Random Forest, Gradient Boosting, Stochastic Gradient Boosting, and XGBoost) for detailed analysis of IS. The findings indicate that the XGBoost regression model was superior in performance compared to the other applied methods.</p>","PeriodicalId":7141,"journal":{"name":"Acta Psychologica","volume":"260 ","pages":"105721"},"PeriodicalIF":2.7000,"publicationDate":"2025-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Machine learning regression models for internal shame.\",\"authors\":\"Nataša Kovač, Kruna Ratković, Hojjatollah Farahani, Peter Watson\",\"doi\":\"10.1016/j.actpsy.2025.105721\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>This study aims to predict Internal Shame (IS) based on childhood trauma, social emotional competence, cognitive flexibility, distress tolerance and alexithymia in an Iranian sample. The regression results suggested that distress tolerance was the most significant predictor, whereas cognitive flexibility had the least impact. We initially tested nine machine learning regression techniques (Multi-Layer Perceptron, AdaBoost, Support Vector Regression, Artificial Neural Network, Decision Tree, Random Forest, Gradient Boosting, Stochastic Gradient Boosting, and Extreme Gradient Boosting). Based on performance evaluation, we retained five models (Decision Tree, Random Forest, Gradient Boosting, Stochastic Gradient Boosting, and XGBoost) for detailed analysis of IS. 
The findings indicate that the XGBoost regression model was superior in performance compared to the other applied methods.</p>\",\"PeriodicalId\":7141,\"journal\":{\"name\":\"Acta Psychologica\",\"volume\":\"260 \",\"pages\":\"105721\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2025-10-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Acta Psychologica\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1016/j.actpsy.2025.105721\",\"RegionNum\":4,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Psychologica","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1016/j.actpsy.2025.105721","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Machine learning regression models for internal shame.
This study aims to predict Internal Shame (IS) from childhood trauma, social-emotional competence, cognitive flexibility, distress tolerance, and alexithymia in an Iranian sample. The regression results suggested that distress tolerance was the most significant predictor, whereas cognitive flexibility had the least impact. We initially tested nine machine learning regression techniques (Multi-Layer Perceptron, AdaBoost, Support Vector Regression, Artificial Neural Network, Decision Tree, Random Forest, Gradient Boosting, Stochastic Gradient Boosting, and Extreme Gradient Boosting). Based on performance evaluation, we retained five models (Decision Tree, Random Forest, Gradient Boosting, Stochastic Gradient Boosting, and XGBoost) for detailed analysis of IS. The findings indicate that the XGBoost regression model outperformed the other applied methods.
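As a concrete illustration of the kind of model comparison the abstract describes, the sketch below shows one way the five retained regressors could be fitted and evaluated in Python with scikit-learn and xgboost. It is not the authors' actual pipeline: the synthetic data, sample size, coefficients, and default hyperparameters are illustrative assumptions standing in for the study's five predictors (childhood trauma, social-emotional competence, cognitive flexibility, distress tolerance, alexithymia) and the IS scores, and for whatever tuning and validation the paper actually used.

```python
# Minimal sketch, not the study's pipeline: compare the five retained
# regression models on toy data shaped like the study's design
# (5 predictors -> one Internal Shame score).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from xgboost import XGBRegressor  # assumes the xgboost package is installed

rng = np.random.default_rng(0)
n = 300                              # hypothetical sample size
X = rng.normal(size=(n, 5))          # 5 standardized predictor scores (illustrative)
# Toy target loosely echoing the reported pattern: predictor 3 ("distress
# tolerance") weighted most, predictor 2 ("cognitive flexibility") least.
y = 0.6 * X[:, 3] - 0.1 * X[:, 2] + rng.normal(scale=0.5, size=n)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

models = {
    "Decision Tree": DecisionTreeRegressor(random_state=0),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
    # Stochastic gradient boosting = gradient boosting with row subsampling
    "Stochastic Gradient Boosting": GradientBoostingRegressor(
        subsample=0.8, random_state=0
    ),
    "XGBoost": XGBRegressor(random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse = mean_squared_error(y_test, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.3f}, R2={r2_score(y_test, pred):.3f}")
```

With real questionnaire data one would replace the synthetic X and y with the measured predictor and IS scores and typically add cross-validation and hyperparameter tuning before comparing models.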
About the journal:
Acta Psychologica publishes original articles and extended reviews on selected books in any area of experimental psychology. The focus of the Journal is on empirical studies and evaluative review articles that increase the theoretical understanding of human capabilities.