PhD Proposal: Multi-Modal Fault Tolerant Learning

Talk
Isabelle Rathbun
Time: 04.23.2026, 12:00 to 13:30

Modern intelligent systems rely on multi-sensor measurements to perceive and interpret their operational environments. The data collected from different sensing modalities supports a variety of downstream tasks, ranging from perception to mapping and control. Performance on these tasks is tightly coupled to the reliability and availability of the underlying sensor measurements. However, sensors are inherently imperfect and can experience hard, soft, or intermittent failures due to environmental conditions, hardware degradation, and communication disruptions. When sensor measurements become corrupted or unavailable, the algorithms relying on them may produce degraded outputs, potentially causing cascading failures in higher-level processing pipelines.
This proposal investigates improving the robustness of multi-sensor systems by replacing failed sensors with generated virtual sensors. We propose a theoretical framework that combines generative artificial intelligence with principles from sensor fusion and multi-sensor integration to synthesize measurements for a failed sensor using information from the remaining operational modalities. This approach enables systems to continue performing critical tasks with minimal performance loss even when physical sensors fail.
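The core idea of substituting a virtual sensor for a failed one can be sketched with a toy example. Here we assume a linear cross-modal mapping learned by least squares purely for illustration; the proposed framework envisions generative models in place of this regressor, and all names and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: features from healthy modalities (e.g. camera, IMU)
# and one fragile sensor (e.g. a range sensor) that may fail.
# During normal operation, synchronized measurements are logged.
n = 500
operational = rng.normal(size=(n, 4))            # healthy-modality features
true_map = rng.normal(size=(4, 2))
range_sensor = operational @ true_map + 0.01 * rng.normal(size=(n, 2))

# Learn a cross-modal mapping from the operational modalities to the
# fragile sensor (a linear least-squares fit stands in for the
# generative model proposed in this work).
W, *_ = np.linalg.lstsq(operational, range_sensor, rcond=None)

# At test time the range sensor fails; synthesize a virtual measurement
# from the remaining modalities and pass it downstream in its place.
new_obs = rng.normal(size=(1, 4))
virtual_measurement = new_obs @ W
print(virtual_measurement.shape)  # (1, 2)
```

In a real system, the fidelity of the virtual sensor would be monitored so that downstream tasks can weight synthesized measurements appropriately, in keeping with standard sensor-fusion practice.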