PhD Defense: Towards Multimodal and Context-Aware Emotion Perception

Talk
Trisha Mittal
Time: 
04.27.2023 11:00 to 13:00
Location: 

Human emotion perception is an integral component of intelligent systems across a wide range of applications, including behavior prediction, social robotics, medicine, surveillance, and entertainment. Current literature suggests that humans perceive emotions and behavior not only from multiple human modalities but also from situational and background context. Our research focuses on this aspect of emotion perception: we build emotion perception models from multiple modalities and contextual cues, and we apply these ideas of perception to various real-world AI applications. This talk covers both parts. In the first part, we present two approaches for improved emotion perception models. In the first approach, we leverage more than one modality to perceive human emotion. In the second approach, we leverage the contextual information available in the input, namely the background scene, multiple modalities of the human subject, and socio-dynamic inter-agent interactions, to predict the perceived emotion. In the second part, we explore three domains of AI applications: (i) video manipulation and deepfake detection, (ii) multimedia content analysis, and (iii) social media interaction analysis, and we enrich solutions in these domains with ideas from emotion perception.

Examining Committee

Chair:

Dr. Dinesh Manocha

Dean's Representative:

Dr. Min Wu

Members:

Dr. Ramani Duraiswami

Dr. Aniket Bera

Dr. Viswanathan Swaminathan (Adobe Research)