Published February 25, 2021 | Version v1
Presentation | Open Access

Learning from graphs: a spectral perspective

  • EPFL

Description

The architecture of a neural network constrains the space of functions it can implement. Equivariance is one such constraint: it enables weight sharing and guarantees generalization. But symmetries alone might not be enough; for example, social networks, finite grids, and sampled spheres have few automorphisms. I will discuss how spectral graph theory yields vertex representations and a generalized convolution that shares weights beyond symmetries.
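To make the spectral viewpoint concrete, here is a minimal sketch (not from the talk itself) of a spectral graph filter: the graph Laplacian's eigenvectors play the role of Fourier modes, and a filter acts by rescaling a signal's components in that basis. The toy 4-node path graph and the heat-kernel filter response are illustrative assumptions.

```python
import numpy as np

# Toy example: adjacency matrix of a 4-node path graph (an assumption
# for illustration; any undirected graph works the same way).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))
L = D - A  # combinatorial graph Laplacian

# Columns of U are the graph Fourier modes; lam are the graph frequencies.
lam, U = np.linalg.eigh(L)

def spectral_filter(x, h):
    """Filter a vertex signal x with spectral response h(lambda):
    transform to the Fourier basis, rescale each mode, transform back."""
    return U @ (h(lam) * (U.T @ x))

# A highly oscillating signal, smoothed by a low-pass (heat-kernel) filter.
x = np.array([1.0, -1.0, 1.0, -1.0])
smoothed = spectral_filter(x, lambda l: np.exp(-l))
```

The filter is defined by a function of the Laplacian's eigenvalues rather than by the vertex labels, which is why the same weights apply to any graph signal regardless of the graph's (lack of) automorphisms.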

Files

learning_from_graphs.pdf (2.7 MB)
md5:151dc77f72e1ea42ebf7624da6f7bc85