Published February 25, 2021 | Version v1
Presentation · Open Access

Learning from graphs: a spectral perspective

  • EPFL


The architecture of a neural network constrains the space of functions it can implement. Equivariance is one such constraint: it enables weight sharing and guarantees generalization. But symmetries alone might not be enough; social networks, finite grids, and sampled spheres, for example, have few automorphisms. I will discuss how spectral graph theory yields vertex representations and a generalized convolution that shares weights beyond symmetries.
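To make the two ideas in the abstract concrete, here is a minimal sketch of how spectral graph theory yields vertex representations (Laplacian eigenmaps) and a spectral graph convolution. The graph, the embedding dimension, and the heat-kernel filter are illustrative assumptions, not the specific constructions from the talk:

```python
import numpy as np

# Illustrative example: a 6-node cycle graph (choice of graph is an assumption).
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

# Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt

# Eigendecomposition: the eigenvectors U form a graph Fourier basis.
lam, U = np.linalg.eigh(L)

# Vertex representation: embed each vertex with the first few
# non-trivial eigenvectors (a Laplacian eigenmap).
embedding = U[:, 1:3]

# Generalized convolution: filter a graph signal x in the spectral
# domain, y = U g(Lambda) U^T x, where g acts on the eigenvalues.
# Here g is a low-pass heat-kernel filter (an illustrative choice).
x = np.random.default_rng(0).standard_normal(n)
g = np.exp(-2.0 * lam)
y = U @ (g * (U.T @ x))
```

Because the filter g depends only on the Laplacian's eigenvalues, not on any graph symmetry, the same weights apply to every vertex, which is one way weight sharing extends beyond automorphisms.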



Files (2.7 MB)
