Title: From Human Digitization to Virtual Try-On: Road to Physical Realism

Abstract: E-commerce has been growing at a rapid pace in recent years, and people are now more likely to shop online than to visit physical stores. Digital try-on systems, one way to improve the user experience and popularize online garment shopping, have drawn the attention of many researchers. However, the technology is still far from practical and easy to use enough to replace physical try-on, mostly due to the gap between the digital and real worlds in modeling and demonstrating garment fitting. Estimating the hidden parameters of garments plays an important role in closing this gap. In this talk, we address these key open research issues by learning the physical constraints through simulation. Our proposed learning-based frameworks focus on improving the efficiency, scalability, and capability of cloth simulation, and enable accurate hidden-parameter estimation by exploiting cloth simulation for supervised learning and gradient-based feedback control.

Bio: Junbang Liang is an Applied Scientist on the Amazon Visual Shopping team, where he works on shoppable video understanding and recommendation. He received his Ph.D. in 2021 from the University of Maryland under Prof. Ming Lin, where his primary research areas were cloth simulation and machine learning. Prior to his current role, he worked on the Amazon Fashion team on 3D virtual try-on (Show-It-On-Me).