Spectral GNN

Spectral Graph Neural Networks

  • Spectral Graph Neural Networks (Spectral GNNs) are a class of neural networks designed to operate on graph-structured data, leveraging spectral methods to analyze the properties of graphs. They employ the graph Laplacian matrix, which encapsulates the structure of the graph, and use its eigenvalues and eigenvectors to process signals on the graph’s nodes and edges. By transforming graph data into the spectral domain, Spectral GNNs can efficiently capture both global and local graph patterns. This page is a collection of our work on Spectral GNNs.
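
    To make the spectral pipeline concrete, here is a minimal numpy sketch (the toy graph and the filter g(λ) = exp(−λ) are illustrative assumptions, not code from any paper below) of filtering a node signal through the eigendecomposition of the normalized graph Laplacian:

```python
import numpy as np

# Toy 4-node path graph (illustrative only): adjacency matrix A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetrically normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
L = np.eye(4) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

# Eigendecomposition L = U diag(eigvals) U^T: U acts as a graph Fourier basis.
eigvals, U = np.linalg.eigh(L)

# Filter a node signal x with g(lambda) = exp(-lambda), a low-pass filter:
# transform to the spectral domain, scale each frequency, transform back.
x = np.array([1.0, 0.0, 0.0, 0.0])
x_hat = U.T @ x                        # graph Fourier transform
y = U @ (np.exp(-eigvals) * x_hat)     # filtered signal on the nodes
```

    Spectral GNNs differ mainly in how the filter g(λ) is parameterized and learned; the explicit eigendecomposition above is what the polynomial methods discussed next avoid.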

    Evolution of Spectral Graph Neural Networks

    Spectral GNNs often involve computationally expensive operations such as eigen-decomposition, which limits their scalability to large graphs. Recent advances have therefore focused on efficiency, typically by approximating the spectral filter with polynomials of the Laplacian so that no explicit eigendecomposition is needed, making Spectral GNNs applicable to real-world tasks such as node classification, link prediction, and graph clustering. These models are particularly effective in applications where the structural information of the graph plays a critical role, including social networks, recommendation systems, and bioinformatics.
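
    The polynomial-approximation idea can be sketched as follows (a minimal numpy illustration under stated assumptions; the function name and toy matrices are hypothetical, not from any paper below). A degree-K Chebyshev filter needs only K sparse matrix-vector products with a Laplacian rescaled so its eigenvalues lie in [-1, 1], never an eigendecomposition:

```python
import numpy as np

def chebyshev_filter(L, x, coeffs):
    """Apply sum_k coeffs[k] * T_k(L) @ x via the Chebyshev recurrence
    T_k(L) x = 2 L T_{k-1}(L) x - T_{k-2}(L) x.
    Assumes L has been rescaled so its eigenvalues lie in [-1, 1]
    (e.g., L_tilde = 2 L / lambda_max - I)."""
    t_prev, t_curr = x, L @ x          # T_0 x = x, T_1 x = L x
    out = coeffs[0] * t_prev
    if len(coeffs) > 1:
        out = out + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2.0 * (L @ t_curr) - t_prev
        out = out + c * t_curr
    return out

# Example: a diagonal stand-in for a rescaled Laplacian (spectrum in [-1, 1]).
L_tilde = np.diag([-0.5, 0.0, 0.5])
x = np.ones(3)
y = chebyshev_filter(L_tilde, x, [0.0, 0.0, 1.0])  # applies T_2(L_tilde)
```

    Learning the coefficients (or the polynomial basis itself) is exactly the design space the papers below explore.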

  • Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials (He et al., 2024) [WWW 2024]


    Citation

    @inproceedings{DBLP:conf/www/HeWFHLSY24,
      author       = {Mingguo He and
                      Zhewei Wei and
                      Shikun Feng and
                      Zhengjie Huang and
                      Weibin Li and
                      Yu Sun and
                      Dianhai Yu},
      editor       = {Tat{-}Seng Chua and
                      Chong{-}Wah Ngo and
                      Ravi Kumar and
                      Hady W. Lauw and
                      Roy Ka{-}Wei Lee},
      title        = {Spectral Heterogeneous Graph Convolutions via Positive Noncommutative
                      Polynomials},
      booktitle    = {Proceedings of the {ACM} on Web Conference 2024, {WWW} 2024, Singapore,
                      May 13-17, 2024},
      pages        = {685--696},
      publisher    = {ACM},
      year         = {2024},
      url          = {https://doi.org/10.1145/3589334.3645515},
      doi          = {10.1145/3589334.3645515},
    }
    
  • Graph Neural Networks with Learnable and Optimal Polynomial Bases (Guo & Wei*, 2023) [ICML 2023]


    Citation

    @inproceedings{DBLP:conf/icml/GuoW23,
      author       = {Yuhe Guo and
                      Zhewei Wei},
      editor       = {Andreas Krause and
                      Emma Brunskill and
                      Kyunghyun Cho and
                      Barbara Engelhardt and
                      Sivan Sabato and
                      Jonathan Scarlett},
      title        = {Graph Neural Networks with Learnable and Optimal Polynomial Bases},
      booktitle    = {International Conference on Machine Learning, {ICML} 2023, 23-29 July
                      2023, Honolulu, Hawaii, {USA}},
      series       = {Proceedings of Machine Learning Research},
      volume       = {202},
      pages        = {12077--12097},
      publisher    = {PMLR},
      year         = {2023},
      url          = {https://proceedings.mlr.press/v202/guo23i.html},
    }
    
  • Clenshaw Graph Neural Networks (Guo & Wei*, 2023) [KDD 2023]


    Citation

    @inproceedings{DBLP:conf/kdd/GuoW23,
      author       = {Yuhe Guo and
                      Zhewei Wei},
      editor       = {Ambuj K. Singh and
                      Yizhou Sun and
                      Leman Akoglu and
                      Dimitrios Gunopulos and
                      Xifeng Yan and
                      Ravi Kumar and
                      Fatma Ozcan and
                      Jieping Ye},
      title        = {Clenshaw Graph Neural Networks},
      booktitle    = {Proceedings of the 29th {ACM} {SIGKDD} Conference on Knowledge Discovery
                      and Data Mining, {KDD} 2023, Long Beach, CA, USA, August 6-10, 2023},
      pages        = {614--625},
      publisher    = {ACM},
      year         = {2023},
      url          = {https://doi.org/10.1145/3580305.3599275},
      doi          = {10.1145/3580305.3599275},
    }
    
  • EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks (Lei et al., 2022) [NeurIPS 2022]


    Citation

    @inproceedings{Lei2022evennet,
      title={EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks},
      author={Lei, Runlin and Wang, Zhen and Li, Yaliang and Ding, Bolin and Wei, Zhewei},
      booktitle={NeurIPS},
      year={2022}
    }
    
  • ChebNetII: Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited (He et al., 2022) [NeurIPS 2022]


    Citation

    @inproceedings{he2022chebnetii,
      title={Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited},
      author={He, Mingguo and Wei, Zhewei and Wen, Ji-Rong},
      booktitle={NeurIPS},
      year={2022}
    }
    
  • BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation (He et al., 2021) [NeurIPS 2021]


    Citation

    @inproceedings{he2021bernnet,
      title={BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation},
      author={He, Mingguo and Wei, Zhewei and Huang, Zengfeng and Xu, Hongteng},
      booktitle={NeurIPS},
      year={2021}
    }