PhD Proposal: Towards Multimodal and Context-Aware Emotion Perception

Talk
Trisha Mittal
Time: 04.29.2022 12:00 to 14:00
Location: IRB 3137

Human emotion perception is an integral component of intelligent systems across a wide range of applications, including behavior prediction, social robotics, medicine, surveillance, and entertainment. Current literature suggests that humans perceive emotions and behavior not only from multiple human modalities but also from situational and background context. Our research focuses on this aspect of emotion perception: we build emotion perception models from multiple modalities and contextual cues, and apply these ideas of perception to real-world AI application domains. This talk covers both parts. In the first part, we present two approaches to improved emotion perception models. In the first approach, we leverage more than one modality to perceive human emotion. In the second, we leverage the contextual information available in the input, namely the background scene, multiple modalities of the human subject, and socio-dynamic inter-agent interactions, to predict the perceived emotion. In the second part, we explore three AI application domains: (i) video manipulation and deepfake detection, (ii) multimedia content analysis, and (iii) social media interaction analysis, enriching solutions to these problems with ideas from emotion perception.

Examining Committee:

Chair: Dr. Dinesh Manocha
Department Representative: Dr. Ramani Duraiswami
Members: Dr. Vanessa Frias-Martinez, Dr. Viswanathan Swaminathan, Dr. Aniket Bera