A Convergence Rate for Manifold Neural Networks

Joyce A. Chew, Deanna Needell, Michael Perlmutter

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)

1 Scopus citation

Abstract

High-dimensional data arises in numerous applications, and the rapidly developing field of geometric deep learning seeks to develop neural network architectures for analyzing such data in non-Euclidean domains such as graphs and manifolds. Recent work has proposed a method for constructing manifold neural networks using the spectral decomposition of the Laplace-Beltrami operator. In that work, the authors also provide a numerical scheme for implementing such neural networks when the manifold is unknown and one only has access to finitely many sample points. They show that this scheme, which relies on building a data-driven graph, converges to the continuum limit as the number of sample points tends to infinity. Here, we build upon this result by establishing a rate of convergence that depends on the intrinsic dimension of the manifold but is independent of the ambient dimension. We also discuss how the rate of convergence depends on the depth of the network and the number of filters used in each layer.
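The construction outlined in the abstract can be illustrated with a minimal sketch: build a Gaussian-kernel graph on the sample points (a standard data-driven approximation to the Laplace-Beltrami operator), then apply a spectral filter through the graph Laplacian's eigendecomposition. The kernel choice, normalization, and filter below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def graph_laplacian(points, epsilon):
    """Unnormalized graph Laplacian L = D - W of a Gaussian-kernel graph
    on the sample points. As the number of samples grows and epsilon
    shrinks appropriately, such Laplacians approximate the manifold's
    Laplace-Beltrami operator (exact normalization may differ)."""
    # pairwise squared Euclidean distances in the ambient space
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / epsilon)
    np.fill_diagonal(W, 0.0)           # no self-loops
    return np.diag(W.sum(axis=1)) - W  # D - W

def spectral_filter(L, x, filter_fn):
    """Apply h(L) x via the eigendecomposition L = U diag(lam) U^T,
    i.e. filter the signal x in the Laplacian's spectral domain."""
    lam, U = np.linalg.eigh(L)
    return U @ (filter_fn(lam) * (U.T @ x))

# 200 samples of the unit circle, a 1-D manifold embedded in R^2
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])
L = graph_laplacian(X, epsilon=0.05)

# One spectral-filter layer with a pointwise nonlinearity, the basic
# building block of a spectral manifold/graph neural network:
x = np.sin(theta) + 0.1 * np.random.default_rng(0).normal(size=200)
y = np.maximum(spectral_filter(L, x, lambda lam: np.exp(-lam)), 0.0)
```

Stacking several such filter-plus-nonlinearity layers, with multiple filters per layer, gives the depth and filter counts whose effect on the convergence rate the paper analyzes.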

Original language: English
Title of host publication: 2023 International Conference on Sampling Theory and Applications, SampTA 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350328851
DOIs
State: Published - 2023
Event: 2023 International Conference on Sampling Theory and Applications, SampTA 2023 - New Haven, United States
Duration: 10 Jul 2023 – 14 Jul 2023

Publication series

Name: 2023 International Conference on Sampling Theory and Applications, SampTA 2023

Conference

Conference: 2023 International Conference on Sampling Theory and Applications, SampTA 2023
Country/Territory: United States
City: New Haven
Period: 10/07/23 – 14/07/23

Keywords

  • Continuum Limits
  • Geometric Deep Learning
  • Manifold Learning

