Emotion Detection from Physiological Markers Using Machine Learning

Mirjana Kocaleva Vitanova, Aleksandra Stojanova Ilievska, Natasa Koceska, Saso Koceski

Abstract


Human emotion recognition in computer and robotic systems is crucial because it allows these systems to respond to users in a way that feels natural and supportive. By interpreting emotional cues, these systems can adjust their interactions, offering empathy, encouragement, or even assistance during times of distress, which enhances user satisfaction and makes technology more accessible and engaging. Emotion recognition methods include analyzing facial expressions, vocal tone, and physiological signals; the latter is especially effective because physiological data offers objective, real-time insights that are less susceptible to misinterpretation or masking than visible expressions. In this paper, we conducted an emotion recognition experiment using physiological markers and machine learning algorithms. Each participant was exposed to video stimuli designed to evoke specific emotions. Using physiological markers such as heart rate (HR) and respiratory rate (RR), seven emotions were analyzed: anger, sadness, fear, amusement, neutrality, surprise, and happiness/joy. Three classification methods were applied: Random Forest, SVM, and J48. According to the results of the experimental evaluation, the J48 algorithm achieved the highest accuracy for classifying emotions from both HR and RR across all emotions. Specifically, the emotions most clearly expressed and distinguishable through RR were fear and sadness, with classification accuracies of 96.43% and 92.86%, respectively. Sadness was also the most accurately classified emotion through HR, with an accuracy of 85.71%. Gender differences were also noted: females reacted more strongly to sadness and males to happiness.
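The classification setup described in the abstract (HR and RR as features, seven emotion classes, Random Forest, SVM, and a C4.5-style decision tree) can be sketched as follows. This is a minimal illustration on fabricated synthetic data, not the paper's dataset or code: the per-emotion HR/RR means are invented for demonstration, and scikit-learn's `DecisionTreeClassifier` (CART) is used as a stand-in for Weka's J48 (C4.5).

```python
# Hypothetical sketch of HR/RR-based emotion classification.
# All data below is synthetic; class means are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier  # CART, standing in for J48/C4.5
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
EMOTIONS = ["anger", "sadness", "fear", "amusement",
            "neutral", "surprise", "joy"]

# Fabricated per-emotion means for (HR in bpm, RR in breaths/min).
means = {e: (70 + 5 * i, 12 + i) for i, e in enumerate(EMOTIONS)}

# Draw 40 synthetic samples per emotion.
X_parts, y = [], []
for emo, (hr_mu, rr_mu) in means.items():
    hr = rng.normal(hr_mu, 1.5, 40)
    rr = rng.normal(rr_mu, 0.8, 40)
    X_parts.append(np.column_stack([hr, rr]))
    y += [emo] * 40
X = np.vstack(X_parts)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Train and score the three classifier families named in the abstract.
for name, clf in [("RandomForest", RandomForestClassifier(random_state=0)),
                  ("SVM", SVC()),
                  ("DecisionTree", DecisionTreeClassifier(random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(name, round(accuracy_score(y_te, clf.predict(X_te)), 2))
```

On real recordings, the features would be HR/RR statistics extracted per stimulus window rather than raw draws, and per-emotion accuracies (as reported for fear and sadness in the paper) would come from a per-class breakdown such as a confusion matrix.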




DOI: https://doi.org/10.31449/inf.v49i21.7442

This work is licensed under a Creative Commons Attribution 3.0 License.