Abstract
In order to better understand manifold neural networks (MNNs), we introduce Manifold Filter-Combine Networks (MFCNs). Our filter-combine framework parallels the popular aggregate-combine paradigm for graph neural networks (GNNs) and naturally suggests many interesting families of MNNs which can be interpreted as manifold analogues of various popular GNNs. We propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating an underlying manifold by a sparse graph. We then prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity, and we numerically demonstrate its effectiveness on real-world and synthetic data sets.
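The sparse-graph approximation described in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical toy example, not the authors' implementation: it samples points from a simple manifold (the unit circle), builds a symmetric k-nearest-neighbor graph, and performs one "filter" step (a heat-kernel-style spectral low-pass via the graph Laplacian) followed by one "combine" step (mixing feature channels with a weight matrix). All parameter values (`n`, `k`, the spectral response, channel counts) are illustrative choices.

```python
import numpy as np

# Toy sketch of a filter-combine layer on a point-cloud approximation
# of a manifold. Assumptions: unit circle as the manifold, kNN graph
# as the sparse approximation, heat-kernel spectral filter.

rng = np.random.default_rng(0)
n, k = 200, 8
theta = np.sort(rng.uniform(0, 2 * np.pi, n))
points = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Pairwise distances and a symmetric kNN adjacency matrix
# (dense here for clarity; sparse in practice).
d = np.linalg.norm(points[:, None] - points[None, :], axis=2)
idx = np.argsort(d, axis=1)[:, 1:k + 1]        # k nearest, skipping self
A = np.zeros((n, n))
A[np.repeat(np.arange(n), k), idx.ravel()] = 1.0
A = np.maximum(A, A.T)                          # symmetrize

# Unnormalized graph Laplacian and its eigendecomposition.
L = np.diag(A.sum(axis=1)) - A
evals, evecs = np.linalg.eigh(L)

# Filter step: attenuate high graph frequencies on each channel.
h = np.exp(-2.0 * evals)                        # illustrative response
x = rng.normal(size=(n, 3))                     # 3 input channels
x_filt = evecs @ (h[:, None] * (evecs.T @ x))

# Combine step: mix channels with a weight matrix + nonlinearity.
W = rng.normal(size=(3, 2))
y = np.tanh(x_filt @ W)                         # 2 output channels
print(y.shape)  # (200, 2)
```

As the number of sampled points grows, the graph Laplacian of such a neighborhood graph is known to approximate the manifold's Laplace–Beltrami operator, which is the sense in which the paper's consistency result connects the discrete computation to a continuum limit.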
| Field | Value |
|---|---|
| Original language | English |
| Article number | 17 |
| Journal | Sampling Theory, Signal Processing, and Data Analysis |
| Volume | 23 |
| Issue number | 2 |
| DOIs | |
| State | Published - Dec 2025 |
Keywords
- Geometric deep learning
- Manifold learning
- Manifold neural networks