
Seyed Sadegh Hosseini, Mohammad Reza Yamaghani
Volume 27, Issue 4 (10-2024)
Abstract

Introduction: Nowadays, artificial intelligence and machine learning influence virtually every field of study. Applying these methods to recognize individuals' emotions by integrating audio, text, and image data has shown higher accuracy than conventional methods and offers various applications for psychologists and for human-machine interaction. Identifying human emotions and individuals' reactions is crucial in psychology and psychotherapy. Emotion recognition has traditionally been performed on a single source at a time, by analyzing facial expressions, speech patterns, or handwritten responses to stimuli and events. However, depending on the subject's condition or the analyst's circumstances, this approach may lack the required accuracy. This paper aimed to achieve high-precision emotion recognition from audio, text, and image data using artificial intelligence and machine learning methods.
Methods: This research adopts a correlation-based approach between emotions and the input data, using machine learning methods and regression analysis to predict a criterion variable from multiple predictor variables: the emotional category serves as the criterion variable, and the audio, image, and text features serve as predictors. The statistical population of this study is the IEMOCAP dataset, and the research follows a mixed quantitative-qualitative design.
Results: The results indicated that combining audio, image, and text data for multimodal emotion recognition significantly outperformed recognition from any single modality alone, achieving a precision of 82.9% on the baseline dataset.
Conclusions: The results demonstrate that, when machine learning and artificial intelligence methods are used, integrating audio, text, and image data yields considerably better precision in identifying human emotions than relying on any individual data source.
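
As an illustration of this kind of pipeline, the sketch below fuses pre-extracted audio, image, and text feature vectors and predicts the emotion category with a simple classifier. The synthetic features, their dimensions, and the use of scikit-learn's LogisticRegression are illustrative assumptions, not the authors' implementation or the actual IEMOCAP features.

# Minimal sketch (not the authors' implementation): fuse pre-extracted
# audio, image, and text features and predict the emotion category.
# All data below are random placeholders with assumed dimensions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score

rng = np.random.default_rng(0)
n = 500                                  # number of utterances (placeholder)
audio = rng.normal(size=(n, 40))         # e.g. prosodic/MFCC-style features
image = rng.normal(size=(n, 128))        # e.g. facial-expression embeddings
text = rng.normal(size=(n, 300))         # e.g. averaged word embeddings
labels = rng.integers(0, 4, size=n)      # emotion categories (placeholder)

# Early fusion: concatenate the three modalities into one predictor matrix,
# so the emotion label is predicted from all feature sets jointly.
X = np.concatenate([audio, image, text], axis=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("macro precision:", precision_score(y_te, clf.predict(X_te), average="macro"))
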
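
A minimal sketch of the unimodal-versus-multimodal comparison this result describes is shown below. All features and labels are random placeholders standing in for IEMOCAP data, so the printed numbers bear no relation to the reported 82.9%; only the shape of the comparison (each modality alone versus an early-fused concatenation) follows the abstract.

# Sketch of the unimodal-vs-multimodal comparison; placeholder data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
modalities = {
    "audio": rng.normal(size=(n, 40)),
    "image": rng.normal(size=(n, 128)),
    "text": rng.normal(size=(n, 300)),
}
labels = rng.integers(0, 4, size=n)

def mean_precision(X, y):
    # Cross-validated macro precision for one feature set.
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, y, cv=5, scoring="precision_macro").mean()

# Score each modality alone, then the early-fused concatenation.
for name, X in modalities.items():
    print(f"{name:6s} precision: {mean_precision(X, labels):.3f}")
fused = np.concatenate(list(modalities.values()), axis=1)
print(f"fused  precision: {mean_precision(fused, labels):.3f}")
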
