{"title":"Hybrid Features Extraction for Adaptive Face Images Retrieval","authors":"A. Alti","doi":"10.4018/ijse.2020010102","DOIUrl":"https://doi.org/10.4018/ijse.2020010102","url":null,"abstract":"Existing methods of face emotion recognition have been limited in performance in terms of recognition accuracy and execution time. It is highly important to use efficient techniques for improving this performance. In this article, the authors present an automatic facial image retrieval combining the advantages of color normalization by texture estimators with the gradient vector. Starting from a query face image, an efficient algorithm for human face by hybrid feature extraction provides very interesting results.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134213101","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Image Tampering Detection Using Convolutional Neural Network","authors":"S. Singhania, A. Arjun, Raina Singh","doi":"10.4018/ijse.2019010103","DOIUrl":"https://doi.org/10.4018/ijse.2019010103","url":null,"abstract":"Pictures are considered the most reliable form of media in journalism, research work, investigations, and intelligence reporting. With the rapid growth of ever-advancing technology and free applications on smartphones, sharing and transferring images is widely spread, which requires authentication and reliability. Copy-move forgery is considered a common image tampering type, where a part of the image is superimposed with another image. Such a tampering process occurs without leaving any obvious visual traces. In this study, an image tampering detection method was proposed by exploiting a convolutional neural network (CNN) for extracting the discriminative features from images and detects whether an image has been forged or not. The results established that the optimal number of epochs is 50 epochs using AlexNet-based CNN for classification-based tampering detection, with a 91% accuracy.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132882876","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Tell-Tale Heart: Perceived Emotional Intensity of Heartbeats","authors":"Joris H. Janssen, W. Ijsselsteijn, J. Westerink, Paul Tacken, Gert-Jan de Vries","doi":"10.4018/jse.2013010103","DOIUrl":"https://doi.org/10.4018/jse.2013010103","url":null,"abstract":"Heartbeats are strongly related to emotions, and people are known to interpret their own heartbeat as emotional information. To explore how people interpret other’s cardiac activity, the authors conducted four experiments. In the first experiment, they aurally presented ten different levels of heart rate to participants and compare emotional intensity ratings. In the second experiment, the authors compare the effects of nine levels of heart rate variability around 0.10 Hz and 0.30 Hz on emotional intensity ratings. In the third experiment, they combined manipulations of heart rate and heart rate variability to compare their effects. Finally, in the fourth experiment, they compare effects of heart rate to effects of angry versus neutral facial expressions, again on emotional intensity ratings. Overall, results show that people relate increases in heart rate to increases in emotional intensity. These effects were similar to effects of the facial expressions. This shows possibilities for using human interpretations of heart rate in communication applications.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133158152","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparison of Several Acoustic Modeling Techniques for Speech Emotion Recognition","authors":"I. Trabelsi, M. Bouhlel","doi":"10.4018/IJSE.2016010105","DOIUrl":"https://doi.org/10.4018/IJSE.2016010105","url":null,"abstract":"Automatic Speech Emotion Recognition (SER) is a current research topic in the field of Human Computer Interaction (HCI) with a wide range of applications. The purpose of speech emotion recognition system is to automatically classify speaker's utterances into different emotional states such as disgust, boredom, sadness, neutral, and happiness. The speech samples in this paper are from the Berlin emotional database. Mel Frequency cepstrum coefficients (MFCC), Linear prediction coefficients (LPC), linear prediction cepstrum coefficients (LPCC), Perceptual Linear Prediction (PLP) and Relative Spectral Perceptual Linear Prediction (Rasta-PLP) features are used to characterize the emotional utterances using a combination between Gaussian mixture models (GMM) and Support Vector Machines (SVM) based on the Kullback-Leibler Divergence Kernel. In this study, the effect of feature type and its dimension are comparatively investigated. The best results are obtained with 12-coefficient MFCC. Utilizing the proposed features a recognition rate of 84% has been achieved which is close to the performance of humans on this database.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"91 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127133549","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Scene-Based Episodic Memory System for a Simulated Autonomous Creature","authors":"E. Castro, Ricardo Ribeiro Gudwin","doi":"10.4018/jse.2013010102","DOIUrl":"https://doi.org/10.4018/jse.2013010102","url":null,"abstract":"In this paper the authors present the development of a scene-based episodic memory module for the cognitive architecture controlling an autonomous virtual creature, in a simulated 3D environment. The scene-based episodic memory has the role of improving the creature’s navigation system, by evoking the objects to be considered in planning, according to episodic remembrance of earlier scenes testified by the creature where these objects were present in the past. They introduce the main background on human memory systems and episodic memory study, and provide the main ideas behind the experiment.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123393224","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Study of the State of the Art in Synthetic Emotional Intelligence in Affective Computing","authors":"Syeda Erfana Zohora, A. M. Khan, A. K. Srivastava, G. Nguyen, N. Dey","doi":"10.4018/IJSE.2016010101","DOIUrl":"https://doi.org/10.4018/IJSE.2016010101","url":null,"abstract":"In the last few decades there has been a tremendous amount of research on synthetic emotional intelligence related to affective computing that has significantly advanced from the technological point of view that refers to academic studies, systematic learning and developing knowledge and affective technology to a extensive area of real life time systems coupled with their applications. The objective of this paper is to present a general idea on the area of emotional intelligence in affective computing. The overview of the state of the art in emotional intelligence comprises of basic definitions and terminology, a study of current technological scenario. The paper also proposes research activities with a detailed study of ethical issues, challenges with importance on affective computing. Lastly, we present a broad area of applications such as interactive learning emotional systems, modeling emotional agents with an intention of employing these agents in human computer interactions as well as in education.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121533615","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Emotion as a Significant Change in Neural Activity","authors":"Karla Parussel","doi":"10.4018/jse.2010101604","DOIUrl":"https://doi.org/10.4018/jse.2010101604","url":null,"abstract":"It is hypothesized here that two classes of emotions exist: driving and satisfying emotions. Driving emotions significantly increase the internal activity of the brain and result in the agent seeking to minimize its emotional state by performing actions that it would not otherwise do. Satisfying emotions decrease internal activity and encourage the agent to continue its current behavior to maintain its emotional state. It is theorized that neuromodulators act as simple yet high impact signals to either agitate or calm specific neural networks. This results in what we can define as either driving or satisfying emotions. The plausibility of this hypothesis is tested in this article using feed-forward networks of leaky integrate-and-fire neurons.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"189 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115239603","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}