Improved Attention-Enhanced Efficient Face-Transformer Model for Multimodal Elderly Emotion Recognition in Smart Homes
Abstract
Recognizing the emotions of the elderly is key to delivering personalized services in smart home environments. Traditional methods have difficulty capturing the correlations and temporal information of multimodal features. To this end, this study proposes a multimodal emotion recognition model that integrates the EfficientFace and Transformer structures and constructs an improved attention mechanism. A modal interaction compensation term is introduced into the similarity calculation to improve the modeling of dynamic dependencies between modalities, while a dynamic importance factor adaptively adjusts feature weights. The model was evaluated on the IEMOCAP dataset and a self-constructed EMED dataset. Emotion recognition precision reached up to 94.37%, recall reached 93.64%, the F1-score was 94.21, and specificity reached 94.85%. In addition, the model achieved 96.24% classification accuracy and 94.13% emotional-intensity recognition accuracy on easily confused categories such as "disgust" and "contempt," with a minimum detection latency of 0.55 seconds. The results show that the model performs strongly in multimodal fusion and emotion recognition for the elderly and is well suited to smart home emotion monitoring.
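The abstract does not give the exact formulation of the improved attention mechanism, so the following minimal PyTorch-style sketch only illustrates the two ideas it names: an additive modal interaction compensation term inside the similarity calculation, and a dynamic importance factor that re-weights features before fusion. The module name CrossModalAttention, the bilinear form of the compensation term, and the sigmoid-gated importance factor are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class CrossModalAttention(nn.Module):
    """Illustrative cross-modal attention sketch (assumed design).

    Adds a learned compensation term to the query-key similarity and
    re-weights value features with a per-token importance factor.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # Assumed: a bilinear layer produces the modal interaction
        # compensation term that is added to the dot-product similarity.
        self.compensation = nn.Bilinear(dim, dim, 1)
        # Assumed: the dynamic importance factor is predicted per token
        # from the value features and gates them before fusion.
        self.importance = nn.Sequential(nn.Linear(dim, 1), nn.Sigmoid())
        self.scale = dim ** -0.5

    def forward(self, x_a: torch.Tensor, x_b: torch.Tensor) -> torch.Tensor:
        # x_a: (B, Ta, D) features of modality A (e.g., facial frames)
        # x_b: (B, Tb, D) features of modality B (e.g., audio segments)
        q, k, v = self.q_proj(x_a), self.k_proj(x_b), self.v_proj(x_b)
        B, Ta, D = q.shape
        Tb = k.shape[1]

        # Standard scaled dot-product similarity.
        sim = torch.einsum("btd,bsd->bts", q, k) * self.scale

        # Modal interaction compensation term, computed pairwise between
        # query and key tokens and added to the similarity (assumed form).
        q_exp = q.unsqueeze(2).expand(B, Ta, Tb, D).reshape(-1, D)
        k_exp = k.unsqueeze(1).expand(B, Ta, Tb, D).reshape(-1, D)
        comp = self.compensation(q_exp, k_exp).view(B, Ta, Tb)
        attn = (sim + comp).softmax(dim=-1)

        # Dynamic importance factor adaptively re-weights the value features.
        weights = self.importance(v)                    # (B, Tb, 1)
        return torch.einsum("bts,bsd->btd", attn, v * weights)

In the full model, such a block would presumably connect the EfficientFace visual features with the streams processed by the Transformer; the projection sizes and gating choices above are placeholders rather than reported design details.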
DOI: https://doi.org/10.31449/inf.v49i33.8838