Application of human-computer interaction technology integrating biomimetic vision system in animation design with a biomechanical perspective
Abstract
The combination of Human-Computer Interaction (HCI) technology with biomimetic vision systems has transformational potential in animation design, particularly when biomechanical principles are incorporated to create immersive and interactive experiences. Traditional animation approaches are frequently insensitive to real-time human motion, which can restrict engagement and realism. This study addresses that constraint with a framework that uses Virtual Reality (VR) and Augmented Reality (AR) to generate dynamic settings encompassing a variety of human activities, informed by biomechanical analysis. A biomimetic vision system records these motions through wearable sensors, allowing precise monitoring of user activity while accounting for biomechanical factors such as joint angles, force distribution, and movement patterns. The recorded data is preprocessed using Z-score normalization, and features are extracted using Principal Component Analysis (PCA). This study proposes an Egyptian Vulture optimized Adjustable Long Short-Term Memory Network (EVO-ALSTM) technique for motion classification, specifically tailored to recognize the biomechanical characteristics of human movements. Results demonstrate strong performance for the motion recognition system, with accuracy of 95%, precision of 93%, recall of 90%, and an F1-score of 91%, highlighting the effectiveness of biomechanical insights in enhancing animation design. The findings indicate that integrating real-time biomechanical data into the animation process leads to more engaging and realistic user experiences. This study not only advances the field of HCI but also lays the groundwork for future investigations into sophisticated animation technologies built on biomimetic and biomechanical systems.
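The preprocessing pipeline named in the abstract (Z-score normalization followed by PCA feature extraction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature matrix is synthetic stand-in data for the wearable-sensor channels (joint angles, forces) described above, and the component count `k` is an arbitrary choice for the example.

```python
import numpy as np

# Hypothetical motion-capture feature matrix: rows are time samples,
# columns are sensor channels (e.g. joint angle, force, acceleration).
# Synthetic stand-in values for the wearable-sensor data in the paper.
rng = np.random.default_rng(0)
X = rng.normal(loc=[30.0, 500.0, 1.2], scale=[5.0, 80.0, 0.3], size=(200, 3))

# Step 1: Z-score normalization -- centre each channel to mean 0
# and scale it to unit standard deviation.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Step 2: PCA via eigendecomposition of the covariance matrix,
# keeping the top-k principal components as the extracted features.
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending by variance
k = 2                                    # assumed number of components
components = eigvecs[:, order[:k]]
X_reduced = Z @ components               # reduced feature matrix

print(X_reduced.shape)  # (200, 2)
```

The reduced features `X_reduced` would then be fed to the downstream classifier (the EVO-ALSTM in this study).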
Copyright (c) 2024 Jing Han
This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright on all articles published in this journal is retained by the author(s), while the author(s) grant the publisher, as the original publisher, the right to publish the article.
Articles published in this journal are licensed under a Creative Commons Attribution 4.0 International, which means they can be shared, adapted and distributed provided that the original published version is cited.