TY - GEN
T1 - A Convergence Rate for Manifold Neural Networks
AU - Chew, Joyce A.
AU - Needell, Deanna
AU - Perlmutter, Michael
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - High-dimensional data arises in numerous applications, and the rapidly developing field of geometric deep learning seeks to develop neural network architectures to analyze such data in non-Euclidean domains, such as graphs and manifolds. Recent work has proposed a method for constructing manifold neural networks using the spectral decomposition of the Laplace-Beltrami operator. In that work, the authors also provide a numerical scheme for implementing such neural networks when the manifold is unknown and one only has access to finitely many sample points. They show that this scheme, which relies upon building a data-driven graph, converges to the continuum limit as the number of sample points tends to infinity. Here, we build upon this result by establishing a rate of convergence that depends on the intrinsic dimension of the manifold but is independent of the ambient dimension. We also discuss how the rate of convergence depends on the depth of the network and the number of filters used in each layer.
AB - High-dimensional data arises in numerous applications, and the rapidly developing field of geometric deep learning seeks to develop neural network architectures to analyze such data in non-Euclidean domains, such as graphs and manifolds. Recent work has proposed a method for constructing manifold neural networks using the spectral decomposition of the Laplace-Beltrami operator. In that work, the authors also provide a numerical scheme for implementing such neural networks when the manifold is unknown and one only has access to finitely many sample points. They show that this scheme, which relies upon building a data-driven graph, converges to the continuum limit as the number of sample points tends to infinity. Here, we build upon this result by establishing a rate of convergence that depends on the intrinsic dimension of the manifold but is independent of the ambient dimension. We also discuss how the rate of convergence depends on the depth of the network and the number of filters used in each layer.
KW - Continuum Limits
KW - Geometric Deep Learning
KW - Manifold Learning
UR - http://www.scopus.com/inward/record.url?scp=85178509484&partnerID=8YFLogxK
U2 - 10.1109/SampTA59647.2023.10301407
DO - 10.1109/SampTA59647.2023.10301407
M3 - Conference contribution
AN - SCOPUS:85178509484
T3 - 2023 International Conference on Sampling Theory and Applications, SampTA 2023
BT - 2023 International Conference on Sampling Theory and Applications, SampTA 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 International Conference on Sampling Theory and Applications, SampTA 2023
Y2 - 10 July 2023 through 14 July 2023
ER -