Incorporating training biomechanics into athlete emotion recognition based on time-series variation characteristics of facial expressions
Abstract
Emotion management is a critical psychological skill for athletes and deserves particular attention during the foundational training phase. Early intervention in athletes’ emotional states allows coaches to devise more effective training plans and ultimately improve competition performance. Because emotions can be evaluated through facial expressions and physiological signals, these provide a basis for detecting athletes’ training emotions. This study explores the biomechanical characteristics of facial expressions for emotion recognition. First, the relationship between facial expressions and facial Action Units (AUs) is analyzed to identify AU combinations that effectively represent each expression. Considering the temporal and spatial dynamics of facial expressions, the AUs are then used as feature inputs to a support vector machine (SVM) model that classifies athletes’ training emotions. Next, the recognized expressions are mapped to corresponding emotional states, and an emotion index is introduced to quantify athletes’ training emotions. Finally, quantitative emotion recognition is achieved from the transient facial expressions observed during training. This research provides a theoretical framework that helps coaches and athletes improve emotional management through biomechanically informed training.
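The abstract outlines an AU-feature SVM pipeline followed by an emotion-index mapping, but does not specify the AU set, the SVM kernel, or the index formula. The sketch below is a minimal illustration under stated assumptions: AU intensities come from an external detector (e.g., OpenFace-style per-frame outputs), an RBF-kernel SVM with probability estimates is used, and the emotion index is the mean probability-weighted valence over a sequence of frames. The class labels and valence weights are hypothetical placeholders, not the authors' definitions.

```python
# Minimal sketch of the AU -> SVM -> emotion-index pipeline described in the abstract.
# Assumptions (not given in the paper): 17 AU intensities per frame, three coarse
# training-emotion classes, and a valence-weighted emotion index in [-1, 1].

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["positive", "neutral", "negative"]                  # assumed class labels
VALENCE = {"positive": 1.0, "neutral": 0.0, "negative": -1.0}   # assumed index weights


def train_emotion_classifier(au_features: np.ndarray, labels: np.ndarray):
    """Fit an SVM on per-frame AU intensity vectors (one row per frame or window)."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    model.fit(au_features, labels)
    return model


def emotion_index(model, au_sequence: np.ndarray) -> float:
    """Quantify training emotion over a frame sequence as a single index in [-1, 1]."""
    probs = model.predict_proba(au_sequence)                    # shape: (n_frames, n_classes)
    weights = np.array([VALENCE[c] for c in model.classes_])    # align weights with class order
    return float(np.mean(probs @ weights))                      # average valence across frames


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((300, 17))                                   # placeholder AU intensities
    y = rng.choice(EMOTIONS, size=300)                          # placeholder labels
    clf = train_emotion_classifier(X, y)
    print("emotion index:", emotion_index(clf, rng.random((50, 17))))
```

In practice the placeholder data would be replaced by AU intensities extracted from training-session video and labels obtained from annotated athlete expressions; the valence mapping would follow whatever emotion-index definition the study adopts.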
Copyright (c) 2025 Haiyan Song, Hongwei Chen, Shu Zhang
This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright on all articles published in this journal is retained by the author(s), who grant the publisher the right of first publication. Articles are licensed under a Creative Commons Attribution 4.0 International License, which permits sharing, adaptation, and distribution provided the original published version is cited.