Manifold filter-combine networks

David R. Johnson, Joyce A. Chew, Siddharth Viswanath, Edward De Brouwer, Deanna Needell, Smita Krishnaswamy, Michael Perlmutter

Research output: Contribution to journal › Article › peer-review

Abstract

In order to better understand manifold neural networks (MNNs), we introduce Manifold Filter-Combine Networks (MFCNs). Our filter-combine framework parallels the popular aggregate-combine paradigm for graph neural networks (GNNs) and naturally suggests many interesting families of MNNs which can be interpreted as manifold analogues of various popular GNNs. We propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating an underlying manifold by a sparse graph. We then prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity, and we numerically demonstrate its effectiveness on real-world and synthetic data sets.
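To make the sparse-graph implementation concrete, here is a minimal sketch of a single filter-combine layer in Python. It assumes a k-NN graph stands in for the underlying manifold, filters are polynomials in the graph Laplacian, and the combine step is a dense channel-mixing matrix; the function names (knn_laplacian, filter_combine_layer) and all parameter choices are illustrative assumptions, not the paper's actual construction.

# Hypothetical sketch of one manifold filter-combine layer on a point cloud.
# Assumptions (not taken from the paper's text): a k-NN graph approximates
# the manifold, filters are polynomials in the graph Laplacian, and the
# combine step is a dense channel-mixing matrix followed by a ReLU.
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph

def knn_laplacian(points, k=10):
    """Sparse normalized Laplacian of a k-NN graph approximating the manifold."""
    A = kneighbors_graph(points, n_neighbors=k, mode="connectivity")
    A = 0.5 * (A + A.T)                       # symmetrize the adjacency matrix
    return laplacian(A, normed=True)

def _apply_power(L, X, j):
    """Compute L^j @ X by repeated sparse matrix products."""
    Y = X
    for _ in range(j):
        Y = L @ Y
    return Y

def filter_combine_layer(L, X, coeffs, W):
    """One filter-combine layer: spectral filtering, then channel mixing.

    L      : (n, n) sparse graph Laplacian (discretized manifold operator)
    X      : (n, d_in) features on the point cloud, one column per channel
    coeffs : (J, p) coefficients of J polynomial filters of degree p - 1
    W      : (J * d_in, d_out) combine matrix mixing the filtered channels
    """
    filtered = [
        sum(a * _apply_power(L, X, j) for j, a in enumerate(c))
        for c in coeffs                       # filter step, one pass per filter
    ]
    H = np.hstack(filtered)                   # (n, J * d_in) stacked responses
    return np.maximum(H @ W, 0.0)             # combine step + ReLU

# Toy usage: points sampled from the unit sphere in R^3.
rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3))
points /= np.linalg.norm(points, axis=1, keepdims=True)
L = knn_laplacian(points, k=10)
X = rng.normal(size=(500, 3))                 # 3 input channels
coeffs = rng.normal(size=(4, 3))              # 4 filters, degree-2 polynomials
W = rng.normal(size=(4 * 3, 8))               # combine into 8 output channels
out = filter_combine_layer(L, X, coeffs, W)   # shape (500, 8)

In a trained network the filter coefficients and combine matrices would be learned parameters, and, as the abstract notes, different choices of filter family yield different families of MNNs within the same framework.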

Original language: English
Article number: 17
Journal: Sampling Theory, Signal Processing, and Data Analysis
Volume: 23
Issue number: 2
DOIs
State: Published - Dec 2025

Keywords

  • Geometric deep learning
  • Manifold learning
  • Manifold neural networks

