Graph Convolutional Neural Networks as Parametric CoKleisli morphisms

December 01, 2022 · The Ethereal · 🏛 arXiv.org

🔮 THE ETHEREAL
Pure theory - exists on a plane beyond code

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Bruno Gavranović, Mattia Villani
arXiv ID: 2212.00542
Category: math.CT (Category Theory)
Cross-listed: cs.LG
Citations: 0
Venue: arXiv.org
Last Checked: 1 month ago
Abstract
We define the bicategory of Graph Convolutional Neural Networks $\mathbf{GCNN}_n$ for an arbitrary graph with $n$ nodes. We show it can be factored through the already existing categorical constructions for deep learning called $\mathbf{Para}$ and $\mathbf{Lens}$ with the base category set to the CoKleisli category of the product comonad. We prove that there exists an injective-on-objects, faithful 2-functor $\mathbf{GCNN}_n \to \mathbf{Para}(\mathsf{CoKl}(\mathbb{R}^{n \times n} \times -))$. We show that this construction allows us to treat the adjacency matrix of a GCNN as a global parameter instead of a local, layer-wise one. This gives us a high-level categorical characterisation of a particular kind of inductive bias GCNNs possess. Lastly, we hypothesize about possible generalisations of GCNNs to general message-passing graph neural networks, connections to equivariant learning, and the (lack of) functoriality of activation functions.
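The "global vs. local parameter" distinction in the abstract can be illustrated concretely. Below is a minimal NumPy sketch (not from the paper; the standard graph-convolution update $H' = \sigma(AHW)$ is assumed, and the function names are hypothetical): the adjacency matrix `A` is threaded unchanged through every layer, mirroring the CoKleisli reading of `A` as a global parameter, while each weight matrix `W` is a local, layer-wise parameter as in $\mathbf{Para}$.

```python
import numpy as np

def relu(x):
    # Elementwise activation (the paper notes activations are not functorial).
    return np.maximum(x, 0.0)

def gcn_layer(A, H, W):
    # One graph-convolution step: H' = relu(A @ H @ W).
    # A (n x n): adjacency matrix; H (n x d_in): node features;
    # W (d_in x d_out): this layer's local, learnable weights.
    return relu(A @ H @ W)

def gcnn(A, H, weights):
    # A is supplied once and shared by every layer (the "global" parameter);
    # only the entries of `weights` vary layer by layer.
    for W in weights:
        H = gcn_layer(A, H, W)
    return H

# Toy 3-node path graph with self-loops.
A = np.array([[1., 1., 0.],
              [1., 1., 1.],
              [0., 1., 1.]])
H = np.random.default_rng(0).normal(size=(3, 4))
weights = [np.ones((4, 4)), np.ones((4, 2))]

out = gcnn(A, H, weights)
print(out.shape)  # (3, 2)
```

Packaging `A` outside the per-layer parameter list is exactly the inductive bias the abstract attributes to the factorisation through $\mathsf{CoKl}(\mathbb{R}^{n \times n} \times -)$.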
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt - Category Theory

🔮 The Ethereal

Algebraic Databases

Patrick Schultz, David I. Spivak, ... (+2 more)

math.CT ๐Ÿ› Theory and Applications of Categories ๐Ÿ“š 35 cites 10 years ago