The Ethereal
Graph Convolutional Neural Networks as Parametric CoKleisli morphisms
December 01, 2022 · The Ethereal · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Bruno Gavranović, Mattia Villani
arXiv ID
2212.00542
Category
math.CT: Category Theory
Cross-listed
cs.LG
Citations
0
Venue
arXiv.org
Last Checked
1 month ago
Abstract
We define the bicategory of Graph Convolutional Neural Networks $\mathbf{GCNN}_n$ for an arbitrary graph with $n$ nodes. We show it can be factored through the already existing categorical constructions for deep learning called $\mathbf{Para}$ and $\mathbf{Lens}$ with the base category set to the CoKleisli category of the product comonad. We prove that there exists an injective-on-objects, faithful 2-functor $\mathbf{GCNN}_n \to \mathbf{Para}(\mathsf{CoKl}(\mathbb{R}^{n \times n} \times -))$. We show that this construction allows us to treat the adjacency matrix of a GCNN as a global parameter instead of a local, layer-wise one. This gives us a high-level categorical characterisation of a particular kind of inductive bias GCNNs possess. Lastly, we hypothesize about possible generalisations of GCNNs to general message-passing graph neural networks, connections to equivariant learning, and the (lack of) functoriality of activation functions.
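The abstract's central point — the adjacency matrix acting as a single global parameter threaded unchanged through every layer, while weight matrices remain local and layer-wise — can be illustrated numerically. The sketch below is an assumption-laden toy, not the paper's categorical construction: it uses the standard GCN layer form $\sigma(AXW)$ and a plain Python loop to mimic how $\mathsf{CoKl}(\mathbb{R}^{n \times n} \times -)$ carries the adjacency as a shared environment. All names (`gcn_layer`, `gcnn`) are hypothetical.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One layer: node features X (n x d_in), *global* adjacency A (n x n),
    *local* layer weights W (d_in x d_out); ReLU as the activation."""
    return np.maximum(0.0, A @ X @ W)

def gcnn(A, X, weights):
    """Compose layers. A is threaded through unchanged — the comonadic
    'environment' of the CoKleisli category — while each W is layer-wise."""
    for W in weights:
        X = gcn_layer(A, X, W)
    return X

rng = np.random.default_rng(0)
n, d = 4, 3
A = rng.random((n, n))                        # adjacency for n nodes
X = rng.random((n, d))                        # initial node features
Ws = [rng.random((d, d)) for _ in range(2)]   # per-layer parameters
out = gcnn(A, X, Ws)
print(out.shape)                              # (4, 3)
```

The design point the paper formalises is visible in the signatures: `A` appears once, outside the list of layer parameters, rather than being duplicated into each layer's parameter space.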
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Category Theory
Algebraic Databases
Open Diagrams via Coend Calculus
Executions in (Semi-)Integer Petri Nets are Compact Closed Categories
Compositional Scientific Computing with Catlab and SemanticModels