Deep Learning Techniques Applied to Affective Computing
Author: Zhen Cui
Publisher: Frontiers Media SA
Total Pages: 151
Release: 2023-06-14
ISBN-10: 2832526365
ISBN-13: 9782832526361
Rating: 4/5 (61 Downloads)
Download or read book Deep Learning Techniques Applied to Affective Computing written by Zhen Cui and published by Frontiers Media SA. This book was released on 2023-06-14 with total page 151 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Affective computing refers to computing that relates to, arises from, or influences emotions. Its goal is to bridge the gap between humans and machines and ultimately endow machines with emotional intelligence, improving natural human-machine interaction. In the context of human-robot interaction (HRI), the hope is that robots can be given human-like capabilities of observation, interpretation, and emotional expression. Research on affective computing has recently made extensive progress, with contributions from many fields, including neuroscience, psychology, education, medicine, behavioral science, sociology, and computer science. Current research concentrates on estimating human emotions from different forms of signals, such as speech, facial expressions, text, EEG, fMRI, and many others. In neuroscience, the neural mechanisms of emotion are explored by combining neuroscience with the psychological study of personality, emotion, and mood. In psychology and philosophy, emotion typically includes a subjective, conscious experience characterized primarily by psychophysiological expressions, biological reactions, and mental states. Because understanding "emotion" is inherently multi-disciplinary, inferring human emotions is genuinely difficult, and a multi-disciplinary approach is required to advance affective computing.

One of the challenging problems in affective computing is the affective gap, i.e., the inconsistency between extracted feature representations and subjective emotions. To bridge this gap, various hand-crafted features have been widely employed to characterize subjective emotions. However, these hand-crafted features are usually low-level and hence may not be discriminative enough to describe subjective emotions. Recently emerged deep learning (deep neural network) techniques offer a possible solution: thanks to their multi-layer network structure, they can learn high-level, discriminative features from large datasets and have shown excellent performance in application domains such as computer vision, signal processing, natural language processing, and human-computer interaction (a minimal sketch of this idea follows the topic list below).

The goal of this Research Topic is to gather novel contributions on deep learning techniques applied to affective computing across psychology, machine learning, neuroscience, education, behavioral science, sociology, and computer science, and to connect with researchers active in related areas such as speech emotion recognition, facial expression recognition, electroencephalogram (EEG) based emotion estimation, human physiological signal (e.g., heart rate) estimation, affective human-robot interaction, and multimodal affective computing. We welcome original research papers as well as review articles on neural approaches to affective computing, from computational models to complete systems. This Research Topic aims to bring together research including, but not limited to:

• Deep learning architectures and algorithms for affective computing tasks such as emotion recognition from speech, face, text, EEG, fMRI, and other signals.
• Explainability of deep learning algorithms for affective computing.
• Multi-task learning techniques for emotion, personality, and depression detection, etc.
• Novel datasets for affective computing.
• Applications of affective computing in robots, such as emotion-aware human-robot interaction and social robots.
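The description above notes that multi-layer networks can learn high-level features directly from data instead of relying on hand-crafted features. As a minimal illustrative sketch of that idea (not taken from the book), the following PyTorch snippet trains a small multi-layer classifier on random stand-in data shaped like pre-extracted EEG feature vectors; the 310-dimensional input, the four emotion classes, and the EmotionClassifier name are all hypothetical assumptions.

```python
# Minimal sketch: a multi-layer network that maps low-level input features
# to higher-level representations for emotion classification.
# Assumptions (not from the book): PyTorch, random stand-in data shaped like
# pre-extracted EEG feature vectors, and four hypothetical emotion classes.
import torch
import torch.nn as nn

class EmotionClassifier(nn.Module):
    """Small multi-layer network: stacked layers learn higher-level
    features from the input before a linear classification head."""
    def __init__(self, in_dim: int = 310, hidden_dim: int = 128, n_classes: int = 4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(x))

if __name__ == "__main__":
    # Stand-in batch: 32 samples of 310-dimensional feature vectors
    # (e.g., channel-band EEG features); labels are random placeholders.
    x = torch.randn(32, 310)
    y = torch.randint(0, 4, (32,))

    model = EmotionClassifier()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    for step in range(5):  # a few illustrative gradient steps
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
        print(f"step {step}: loss = {loss.item():.4f}")
```

In practice, the same pattern extends to the modalities listed above (speech, face, text, EEG, fMRI) by swapping the backbone for an architecture suited to the signal, e.g., a convolutional or recurrent network, and training on a real labeled emotion dataset rather than random tensors.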