PhD Proposal: Physical Intelligence - Embedded, Embodied, And Everywhere
The next generation of computing systems is moving beyond the cloud to become tightly coupled with the physical world, enabling a form of “Physical Intelligence” that is embedded in our environments, embodied in our wearables, and present everywhere in daily life. Yet enabling machines to truly understand spatial layouts, material properties, and human physiological context remains difficult: current sensing pipelines often miss important directional structure, and purely data-driven models struggle to generalize across changing physical conditions. This dissertation proposes a unified framework for building spatially-aware systems by bridging signal physics and deep learning, using Implicit Neural Representations (INRs) constrained by wave propagation to learn compact and generalizable models of the physical world.
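To make the idea of "constraining an INR by wave propagation" concrete, the sketch below shows one common form such a constraint takes: a Helmholtz residual, |∇²p + k²p|², which is near zero for any field that is a true propagating wave and large otherwise, and which can therefore serve as a physics loss on a learned field. This is a minimal illustration under assumed details (finite-difference Laplacian, closed-form test fields), not the dissertation's actual model or loss.

```python
import numpy as np

# Minimal sketch (assumptions, not the proposal's actual method): the kind
# of wave-physics residual that can constrain a neural field. We evaluate
# the discrete Helmholtz residual |lap(p) + k^2 p|^2 on two closed-form
# fields: a plane wave (an exact solution, residual ~ 0) and a smooth but
# non-physical Gaussian bump (large residual).

def helmholtz_residual(p, k, h):
    """Mean squared Helmholtz residual of a complex field sampled on a
    uniform grid with spacing h (5-point Laplacian, interior points only)."""
    lap = (p[1:-1, :-2] + p[1:-1, 2:] + p[:-2, 1:-1] + p[2:, 1:-1]
           - 4.0 * p[1:-1, 1:-1]) / h**2
    r = lap + k**2 * p[1:-1, 1:-1]
    return np.mean(np.abs(r) ** 2)

k = 2 * np.pi          # wavenumber (wavelength = 1 in these units)
h = 1e-3               # grid spacing, fine relative to the wavelength
x, y = np.meshgrid(np.arange(64) * h, np.arange(64) * h, indexing="ij")

# A plane wave travelling at 30 degrees: an exact Helmholtz solution.
kx, ky = k * np.cos(np.pi / 6), k * np.sin(np.pi / 6)
wave = np.exp(1j * (kx * x + ky * y))

# A smooth but non-physical field for comparison.
blob = np.exp(-((x - 0.032) ** 2 + (y - 0.032) ** 2) / 1e-4).astype(complex)

print(helmholtz_residual(wave, k, h))   # near zero
print(helmholtz_residual(blob, k, h))   # many orders of magnitude larger
```

In a training loop, a term like `helmholtz_residual` (computed via autodiff rather than finite differences) would be added to the data-fitting loss, steering the network toward fields that are physically realizable rather than merely interpolative.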
The contributions are organized around three intertwined domains of physical intelligence: Embedded, Everywhere, and Embodied. For Embedded intelligence, we develop frequency-domain neural fields that capture how sound propagates in complex, confined spaces, enabling high-fidelity rendering and control without extensive manual tuning. For Everywhere intelligence, we advance volumetric reconstruction by operating directly on wave-based measurements, allowing high-quality imaging from sparse, near-field observations and using spectral cues to infer how people are oriented with respect to each other. For Embodied intelligence, we extend sensing into the human body and mind, using ultrasonic signals and tiny internal muscle movements to detect shifts in attention in real time on commodity devices. Taken together, these works show that grounding neural models in signal physics can yield systems that are not only perceptually accurate but physically intelligent—able to reason about the surrounding geometry, the properties of objects, and aspects of a user’s cognitive state.
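As an architectural illustration of the "frequency-domain neural field" idea in the Embedded thrust, the sketch below shows one plausible shape for such a model: a sinusoidal coordinate network (SIREN-style) mapping a listener position and a query frequency to a complex acoustic transfer-function value. The layer sizes, activation scaling, and input parameterization are assumptions for illustration; the weights are random, so the output is an untrained sample, not a rendered acoustic field.

```python
import numpy as np

# Illustrative sketch only: a sinusoidal coordinate network that represents
# sound propagation in the frequency domain as a field
#   (x, y, f) -> complex transfer-function value.
# All architectural choices here are assumptions, not the dissertation's model.

rng = np.random.default_rng(0)

def init_layer(n_in, n_out, w0=30.0, first=False):
    # SIREN-style uniform initialization, scaled down after the first layer.
    bound = 1.0 / n_in if first else np.sqrt(6.0 / n_in) / w0
    return rng.uniform(-bound, bound, (n_in, n_out)), np.zeros(n_out)

def siren_field(coords, layers, w0=30.0):
    """coords: (N, 3) array of normalized (x, y, frequency) queries.
    Returns (N,) complex field values (real/imag parts from the last layer)."""
    h = coords
    for i, (W, b) in enumerate(layers[:-1]):
        pre = h @ W + b
        h = np.sin(w0 * pre) if i == 0 else np.sin(pre)
    W, b = layers[-1]
    out = h @ W + b                      # (N, 2): real and imaginary parts
    return out[:, 0] + 1j * out[:, 1]

layers = [init_layer(3, 64, first=True), init_layer(64, 64), init_layer(64, 2)]

# One query: position and frequency normalized to roughly [-1, 1].
query = np.array([[0.5, 0.2, 0.1]])
print(siren_field(query, layers))        # one complex transfer-function sample
```

Representing the field over frequency (rather than time) is what makes rendering cheap at query time: each frequency bin is a single forward pass, and a physics term such as a Helmholtz residual can regularize the field per bin.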