All Papers
Processing data on multiple interacting graphs is crucial for many applications, but existing approaches rely mostly on discrete filtering or first-order continuous models that dampen high frequencies and propagate information slowly. We introduce second-order tensorial partial differential equations on graphs (So-TPDEGs) and propose the first theoretically grounded framework for second-order continuous product graph neural networks. Our method exploits the separability of cosine kernels in Cartesian product graphs to enable efficient spectral decomposition while preserving high-frequency signals. We further provide rigorous analyses of stability under graph perturbations and over-smoothing, establishing a solid theoretical foundation for continuous graph learning.
View on arXiv
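
Below is a minimal numerical sketch, not the authors' code, of the mechanism the abstract describes: on a Cartesian product graph the Laplacian eigenpairs factor through the two smaller graphs, so a second-order (wave-type) dynamics d²x/dt² = -Lx can be evaluated with a cosine spectral kernel using only the factor eigendecompositions. The graph sizes, the time parameter `t`, and the zero-initial-velocity assumption are illustrative choices, not the paper's setup.

```python
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

def second_order_filter(adj1, adj2, x0, t=1.0):
    """Evolve x(t) = cos(t * sqrt(L_prod)) x0 on the Cartesian product graph,
    where L_prod is the Kronecker sum of L1 and L2, using only factor eigenpairs."""
    lam1, U1 = np.linalg.eigh(laplacian(adj1))   # spectrum of the first factor
    lam2, U2 = np.linalg.eigh(laplacian(adj2))   # spectrum of the second factor
    # Product-graph GFT: rows of x0 index nodes of G1, columns index nodes of G2.
    x_hat = U1.T @ x0 @ U2
    # Cosine kernel on the Kronecker-sum eigenvalues lam1_i + lam2_j,
    # which preserves high-frequency components instead of damping them.
    kernel = np.cos(t * np.sqrt(np.clip(lam1[:, None] + lam2[None, :], 0.0, None)))
    return U1 @ (kernel * x_hat) @ U2.T

# Toy example: a 4-node path graph and a 3-node cycle as factors.
path4 = np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
cycle3 = np.ones((3, 3)) - np.eye(3)
x0 = np.random.default_rng(0).standard_normal((4, 3))  # one value per product-graph node
xt = second_order_filter(path4, cycle3, x0, t=0.5)
print(xt.shape)  # (4, 3)
```

Because only the two factor Laplacians are diagonalized, the cost scales with the factor sizes rather than with the full product graph, which is the efficiency argument the abstract points to.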