Abstract
Understanding proteins is one of several challenges in the natural sciences where machine learning could enable breakthroughs with broad societal benefits. What do these challenges have in common, and how do they differ from traditional deep learning applications like vision or NLP? I argue that a crucial difference is the increased number of task symmetries at the length scales of molecules and below. In this talk, I present my PhD research on permutation invariance and roto-translation equivariance. Using molecular tasks as examples, I examine how these symmetries can be leveraged to solve such tasks efficiently and robustly. I provide a deep dive into the SE(3)-Transformer, and I also cover work on deep learning for sets and on leveraging symmetries in generative modeling of molecules.
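To make the two symmetries above concrete, here is a minimal sketch (not taken from the talk; all function names are illustrative assumptions): a sum-pooled set function is permutation *invariant* in the spirit of Deep Sets, while the centroid of a point cloud is rotation *equivariant*, meaning rotating the input rotates the output by the same angle.

```python
# Illustrative sketch of permutation invariance and rotation equivariance.
# Function names (phi, set_function, rotate, centroid) are assumptions for
# this example, not code from the talk or the SE(3)-Transformer.
import math

def phi(x):
    # Per-element feature map (an arbitrary illustrative nonlinearity).
    return x * x + 1.0

def set_function(xs):
    # Sum pooling over per-element features: summation ignores element
    # order, so the whole function is permutation invariant.
    return sum(phi(x) for x in xs)

def rotate(points, theta):
    # Rotate 2-D points by angle theta about the origin.
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def centroid(points):
    # The centroid is rotation-equivariant: centroid(R @ points)
    # equals R @ centroid(points), because averaging is linear.
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

# Invariance: permuting the set leaves the output unchanged.
assert set_function([1.0, 2.0, 3.0]) == set_function([3.0, 1.0, 2.0])

# Equivariance: f(rotate(x)) == rotate(f(x)), up to float error.
pts = [(1.0, 0.0), (0.0, 2.0)]
lhs = centroid(rotate(pts, 0.7))
rhs = rotate([centroid(pts)], 0.7)[0]
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
```

An equivariant network such as the SE(3)-Transformer builds these commutation properties into every layer, rather than hoping the model learns them from data augmentation.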
Speaker Bio
I studied physics in Erlangen, at Imperial College, and in Heidelberg. I developed an interest in using computational physics to understand biological processes and wrote my master's thesis on virus self-assembly in hydrodynamic flow. During my PhD, I worked on fundamental machine learning methods, specifically how to leverage symmetries, often applied to problems in physics and biochemistry. I worked with Ingmar Posner at the University of Oxford and later began collaborating with Max Welling. I recently interned at DeepMind, where I was supervised by Adam Santoro.