Emotion recognition by skeleton-based spatial and temporal analysis

dc.authoridOGUZ, ABDULHALIK/0000-0003-4912-7697
dc.contributor.authorOguz, Abdulhalik
dc.contributor.authorErtugrul, Omer Faruk
dc.date.accessioned2024-12-24T19:27:03Z
dc.date.available2024-12-24T19:27:03Z
dc.date.issued2024
dc.departmentSiirt Üniversitesi
dc.description.abstractThis study introduces an automatic emotion recognition (AER) system focusing on skeleton-based kinematic datasets for enhanced human-computer interaction. Departing from conventional approaches, it achieves real-time emotion recognition in real-life situations. The dataset covers seven emotions and is assessed with eight diverse machine and deep learning algorithms. A thorough investigation is undertaken by varying window sizes and data states, including raw joint positions and feature-extracted data. The findings indicate that incorporating advanced techniques such as joint-related feature extraction and robust classifier models yields promising outcomes. Dataset augmentation via varying window sizes enriches insights into real-world scenarios. Evaluations show classification accuracy surpassing 99% for small windows, 94% for medium windows, and 88% for larger windows, confirming the robustness of the approach. Furthermore, we highlight the impact of window size on emotion detection and the benefits of combining coordinate axes for efficiency and accuracy. The analysis examines the contributions of features at both the joint and axis levels, supporting well-informed feature selection. The study's contributions include carefully curated datasets, transparent code, and models, all of which ensure replicability. The paper establishes a benchmark that bridges theory and practice, demonstrating the proposed approach's effectiveness in balancing accuracy and efficiency. By advancing AER through kinematic data, it sets a new standard for efficacy while driving seamless human-computer interaction through rigorous analysis and strategic design.
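The pipeline the abstract describes (sliding windows over skeleton joint positions, joint-level feature extraction, then classification) can be sketched as follows. This is a minimal illustration, not the authors' released code: the synthetic data generator, the window/step sizes, and the choice of a random-forest classifier are all assumptions for demonstration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_sequence(label, n_frames=200, n_joints=20):
    """Synthetic stand-in for a motion-capture sequence:
    n_frames x n_joints x 3 axes, with label-dependent drift (illustrative only)."""
    base = rng.normal(size=(n_joints, 3))
    drift = 0.01 * (label + 1)
    return base + drift * np.cumsum(rng.normal(size=(n_frames, n_joints, 3)), axis=0)

def window_features(seq, win, step):
    """Slide a fixed-size window over frames and extract simple joint-level
    statistics per window: mean position, std, and mean absolute
    frame-to-frame displacement (a velocity proxy)."""
    feats = []
    for start in range(0, len(seq) - win + 1, step):
        w = seq[start:start + win]            # (win, joints, 3)
        vel = np.diff(w, axis=0)              # frame-to-frame displacement
        feats.append(np.concatenate([w.mean(0).ravel(),
                                     w.std(0).ravel(),
                                     np.abs(vel).mean(0).ravel()]))
    return np.array(feats)

# Build a toy dataset: seven emotion classes, a few sequences each.
X, y = [], []
for label in range(7):
    for _ in range(5):
        F = window_features(make_sequence(label), win=30, step=15)
        X.append(F)
        y.extend([label] * len(F))
X, y = np.vstack(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"windowed accuracy: {clf.score(X_te, y_te):.2f}")
```

Varying `win` reproduces the abstract's window-size experiment in miniature: smaller windows yield more training samples per sequence, while larger windows summarize longer motion context per sample.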
dc.identifier.doi10.1016/j.eswa.2023.121981
dc.identifier.issn0957-4174
dc.identifier.issn1873-6793
dc.identifier.scopus2-s2.0-85175169287
dc.identifier.scopusqualityQ1
dc.identifier.urihttps://doi.org/10.1016/j.eswa.2023.121981
dc.identifier.urihttps://hdl.handle.net/20.500.12604/6486
dc.identifier.volume238
dc.identifier.wosWOS:001098754600001
dc.identifier.wosqualityQ1
dc.indekslendigikaynakWeb of Science
dc.indekslendigikaynakScopus
dc.language.isoen
dc.publisherPergamon-Elsevier Science Ltd
dc.relation.ispartofExpert Systems With Applications
dc.relation.publicationcategoryArticle - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rightsinfo:eu-repo/semantics/closedAccess
dc.snmzKA_20241222
dc.subjectEmotion recognition
dc.subjectBody posture
dc.subjectDeep learning
dc.subjectMachine learning
dc.subjectFeature selection
dc.subjectMoCap
dc.titleEmotion recognition by skeleton-based spatial and temporal analysis
dc.typeArticle