Fundamentals of AI CDT

Development site for the EIT FOAI CDT

Title

When does equivariance help?

Challenge

While architecturally enforced equivariance is theoretically appealing, there is ongoing debate, and no clear quantitative understanding, of the performance trade-offs between building equivariance into a neural network and allowing the network to learn the relevant symmetries from data.
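
For reference, the standard definition: a map f is equivariant to a group G acting through representations ρ_in on inputs and ρ_out on outputs when

    f(ρ_in(g) · x) = ρ_out(g) · f(x)   for all g ∈ G,

with invariance as the special case where ρ_out(g) is the identity. Data augmentation targets the same constraint statistically, by training on inputs transformed by randomly drawn group elements rather than restricting the architecture.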

Description

This project will develop a mathematical and empirical framework for quantifying the sample-complexity and performance benefits of equivariant models. The student will investigate how factors such as dataset size, the complexity of the symmetry group, and the use of data augmentation affect the performance gap between equivariant and standard architectures. The goal is a clear characterization of when explicit equivariance provides a significant advantage, and when standard models can effectively learn the necessary symmetries from data alone; a minimal version of such a comparison is sketched below.
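
As a starting point, the following is a minimal, self-contained sketch of such a comparison in PyTorch. Everything in it is an illustrative assumption rather than part of the project: a toy O(2)-invariant regression target, small MLPs, and arbitrary hyperparameters. It trains one model with the invariance built in (it only sees the rotation-invariant feature ‖x‖) and one standard MLP trained with rotation augmentation, across a sweep of training-set sizes, so the gap in sample efficiency can be read off the resulting error curves.

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

def target(x):
    # Toy O(2)-invariant ground truth on R^2: depends only on the radius.
    return torch.sin(3.0 * x.norm(dim=-1, keepdim=True))

def random_rotation(x):
    # Data augmentation: apply an independent random planar rotation per sample.
    theta = torch.rand(x.shape[0]) * 2 * math.pi
    c, s = torch.cos(theta), torch.sin(theta)
    rot = torch.stack([torch.stack([c, -s], -1),
                       torch.stack([s,  c], -1)], -2)   # (batch, 2, 2)
    return (rot @ x.unsqueeze(-1)).squeeze(-1)

def mlp(d_in):
    return nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(), nn.Linear(64, 1))

radial = lambda x: x.norm(dim=-1, keepdim=True)   # invariant feature map

def fit(model, x, y, feats=lambda x: x, aug=None, steps=2000, lr=1e-2):
    # Full-batch Adam on MSE; labels stay valid under `aug` because the
    # target is invariant to the augmenting transformations.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        xb = aug(x) if aug is not None else x
        loss = ((model(feats(xb)) - y) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return model

x_test = torch.randn(2000, 2)
y_test = target(x_test)

for n in [32, 128, 512, 2048]:   # sweep the training-set size
    x = torch.randn(n, 2)
    y = target(x)
    inv = fit(mlp(1), x, y, feats=radial)          # invariance built in
    std = fit(mlp(2), x, y, aug=random_rotation)   # symmetry learned via augmentation
    with torch.no_grad():
        e_inv = ((inv(radial(x_test)) - y_test) ** 2).mean().item()
        e_std = ((std(x_test) - y_test) ** 2).mean().item()
    print(f"n={n:5d}  invariant MSE={e_inv:.4f}  augmented-MLP MSE={e_std:.4f}")
```

The natural next variable to manipulate in this setup is the symmetry group itself, for example replacing the continuous rotations with a finite subgroup, which is one of the axes (group complexity) the project proposes to study.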

Skills Required

Group theory, statistical learning theory, information theory, deep learning fundamentals

Skills to be Developed

Equivariant neural networks, empirical benchmarking

Relevant Background Reading

  1. Satorras, V.G., Hoogeboom, E. and Welling, M., 2021. E(n) equivariant graph neural networks. arXiv preprint arXiv:2102.09844.
  2. Abramson, J. et al., 2024. Accurate structure prediction of biomolecular interactions with AlphaFold 3. Nature, 630, pp.493-500. (an example where equivariance was removed without a drop in performance)
  3. Azizian, W. and Lelarge, M., 2020. Expressive power of invariant and equivariant graph neural networks. arXiv preprint arXiv:2006.15646.