Spectral GNN
Spectral Graph Neural Networks
-
Spectral Graph Neural Networks (Spectral GNNs) are a class of neural networks designed to operate on graph-structured data, leveraging spectral graph theory to analyze graph properties. They employ the graph Laplacian matrix, which encapsulates the structure of the graph, and use its eigenvalues and eigenvectors to process signals defined on the graph’s nodes and edges. By transforming graph data into the spectral domain, Spectral GNNs can capture both global and local graph patterns. This page collects our work on Spectral GNNs.
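As an illustrative sketch only (not the method of any specific paper on this page), the snippet below shows the basic spectral-filtering idea in numpy: build the symmetric normalized Laplacian, take its eigen-decomposition, transform a node signal into the spectral domain, scale each frequency with a response g(λ), and transform back. The function names, the toy graph, and the low-pass response are hypothetical and chosen for clarity.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt

def spectral_filter(A, x, g):
    """Filter a node signal x with a spectral response g(lambda)."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)      # eigen-decomposition of the Laplacian
    x_hat = U.T @ x                 # graph Fourier transform of the signal
    return U @ (g(lam) * x_hat)     # scale each frequency, transform back

# Toy example: a low-pass filter that damps high-frequency components
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 0.0, 0.0, 1.0])
y = spectral_filter(A, x, g=lambda lam: np.exp(-2.0 * lam))
```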
Evolution of Spectral Graph Neural Networks

Spectral GNNs often involve computationally expensive operations such as eigen-decomposition, which limits their scalability to large graphs. Recent advances have therefore focused on improving efficiency by approximating these operations (for example, with polynomial filters, as in the works listed below), making Spectral GNNs applicable to real-world tasks such as node classification, link prediction, and graph clustering. These models are particularly effective in applications where the structural information of the graph plays a critical role, including social networks, recommendation systems, and bioinformatics.
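To illustrate the kind of approximation these works build on, here is a minimal sketch of a Chebyshev polynomial filter that avoids eigen-decomposition entirely and only uses sparse-friendly matrix-vector products. It assumes the symmetric normalized Laplacian with λ_max ≈ 2 so the spectrum can be shifted to [-1, 1]; the coefficients and the toy path graph are made up for illustration and are not taken from any particular paper.

```python
import numpy as np

def chebyshev_filter(L, x, coeffs):
    """y = sum_k coeffs[k] * T_k(L_hat) @ x, computed with the Chebyshev
    recurrence T_k = 2 L_hat T_{k-1} - T_{k-2}; no eigen-decomposition needed.
    L_hat = L - I shifts the normalized Laplacian's spectrum from [0, 2] to [-1, 1]."""
    L_hat = L - np.eye(L.shape[0])
    t_prev, t_curr = x, L_hat @ x                  # T_0(L_hat) x and T_1(L_hat) x
    out = coeffs[0] * t_prev
    if len(coeffs) > 1:
        out = out + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2 * (L_hat @ t_curr) - t_prev
        out = out + c * t_curr
    return out

# Toy example on a path graph with 4 nodes
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(4) - np.diag(d ** -0.5) @ A @ np.diag(d ** -0.5)   # normalized Laplacian
y = chebyshev_filter(L, np.array([1.0, 0.0, 0.0, 0.0]), coeffs=[0.5, 0.3, 0.2])
```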
-
Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials (He et al., 2024) [WWW 2024]
-
Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials
The Web Conference (WWW), 2024. (Oral)
-
Citation
@inproceedings{DBLP:conf/www/HeWFHLSY24,
  author    = {Mingguo He and Zhewei Wei and Shikun Feng and Zhengjie Huang and Weibin Li and Yu Sun and Dianhai Yu},
  editor    = {Tat-Seng Chua and Chong-Wah Ngo and Ravi Kumar and Hady W. Lauw and Roy Ka-Wei Lee},
  title     = {Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials},
  booktitle = {Proceedings of the ACM on Web Conference 2024, WWW 2024, Singapore, May 13-17, 2024},
  pages     = {685--696},
  publisher = {ACM},
  year      = {2024},
  url       = {https://doi.org/10.1145/3589334.3645515},
  doi       = {10.1145/3589334.3645515},
}
-
Graph Neural Networks with Learnable and Optimal Polynomial Bases (Guo & Wei*, 2023) [ICML 2023]
-
Graph Neural Networks with Learnable and Optimal Polynomial Bases
International Conference on Machine Learning (ICML), 2023
-
Citation
@inproceedings{DBLP:conf/icml/GuoW23,
  author    = {Yuhe Guo and Zhewei Wei},
  editor    = {Andreas Krause and Emma Brunskill and Kyunghyun Cho and Barbara Engelhardt and Sivan Sabato and Jonathan Scarlett},
  title     = {Graph Neural Networks with Learnable and Optimal Polynomial Bases},
  booktitle = {International Conference on Machine Learning, ICML 2023, 23-29 July 2023, Honolulu, Hawaii, USA},
  series    = {Proceedings of Machine Learning Research},
  volume    = {202},
  pages     = {12077--12097},
  publisher = {PMLR},
  year      = {2023},
  url       = {https://proceedings.mlr.press/v202/guo23i.html},
}
-
Clenshaw Graph Neural Networks (Guo & Wei*, 2023) [KDD 2023]
-
Clenshaw Graph Neural Networks
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2023
-
Citation
@inproceedings{DBLP:conf/kdd/GuoW23,
  author    = {Yuhe Guo and Zhewei Wei},
  editor    = {Ambuj K. Singh and Yizhou Sun and Leman Akoglu and Dimitrios Gunopulos and Xifeng Yan and Ravi Kumar and Fatma Ozcan and Jieping Ye},
  title     = {Clenshaw Graph Neural Networks},
  booktitle = {Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2023, Long Beach, CA, USA, August 6-10, 2023},
  pages     = {614--625},
  publisher = {ACM},
  year      = {2023},
  url       = {https://doi.org/10.1145/3580305.3599275},
  doi       = {10.1145/3580305.3599275},
}
-
EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks (Lei et al., 2022) [NeurIPS 2022]
-
EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks
Annual Conference on Neural Information Processing Systems (NeurIPS), 2022
-
Citation
@inproceedings{Lei2022evennet,
  title     = {EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks},
  author    = {Lei, Runlin and Wang, Zhen and Li, Yaliang and Ding, Bolin and Wei, Zhewei},
  booktitle = {NeurIPS},
  year      = {2022}
}
-
ChebNetII: Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited (He et al., 2022) [NeurIPS 2022]
-
Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited
Annual Conference on Neural Information Processing Systems (NeurIPS), 2022. (Oral)
-
Citation
@inproceedings{he2022chebnetii,
  title     = {Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited},
  author    = {He, Mingguo and Wei, Zhewei and Wen, Ji-Rong},
  booktitle = {NeurIPS},
  year      = {2022}
}
-
BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation (He et al., 2021) [NeurIPS 2021]
-
BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation
Annual Conference on Neural Information Processing Systems (NeurIPS), 2021
-
Citation
@inproceedings{he2021bernnet,
  title     = {BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation},
  author    = {He, Mingguo and Wei, Zhewei and Huang, Zengfeng and Xu, Hongteng},
  booktitle = {NeurIPS},
  year      = {2021}
}