T-HyperGNNs: Hypergraph Neural Networks Via Tensor Representations
Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and have achieved state-of-the-art performance on a broad range of applications, there have been limited attempts at exploring high-dimensional hypergraph descriptors (tensors) and the joint node interactions carried by hyperedges. In this paper, we depart from hypergraph matrix representations and present a new tensor-HyperGNN framework (T-HyperGNN) with cross-node interactions. The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra, which is closely connected to the spectral space. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution approach to formulate the T-spatial convolution, and we further devise a novel tensor message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared to state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a cross-node interaction layer. These advantages of our T-HyperGNNs are demonstrated on a wide range of real-world hypergraph datasets.
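To make the t-product algebra referenced above concrete, the following is a minimal NumPy sketch of the standard third-order t-product (frontal-slice matrix products in the Fourier domain along the tube dimension), which underlies the spectral connection mentioned in the abstract. The `t_product` helper and the tensor shapes used here are illustrative assumptions, not the paper's implementation of T-spectral convolution.

```python
import numpy as np

def t_product(A, B):
    """Standard t-product of third-order tensors:
    A has shape (n1, n2, n3), B has shape (n2, l, n3);
    the result has shape (n1, l, n3). It is computed as
    independent matrix products of frontal slices in the
    Fourier domain along the third (tube) dimension."""
    n3 = A.shape[2]
    assert B.shape[2] == n3 and A.shape[1] == B.shape[0]
    # FFT along the tube dimension turns circular convolution
    # of frontal slices into slice-wise matrix multiplications.
    A_hat = np.fft.fft(A, axis=2)
    B_hat = np.fft.fft(B, axis=2)
    C_hat = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        C_hat[:, :, k] = A_hat[:, :, k] @ B_hat[:, :, k]
    # Inverse FFT returns to the original domain; the result is
    # real when both inputs are real.
    return np.real(np.fft.ifft(C_hat, axis=2))

# Toy usage with arbitrary, illustrative shapes.
A = np.random.rand(4, 5, 3)   # e.g., a tensor-valued signal
B = np.random.rand(5, 2, 3)   # e.g., a tensor-valued filter
C = t_product(A, B)           # shape (4, 2, 3)
```

Because the t-product diagonalizes under the FFT, filtering in this algebra reduces to ordinary matrix multiplications per Fourier slice, which is what ties the tensor formulation to a spectral-style convolution.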
Funding
Air Force Office of Scientific Research
JP Morgan Chase & Co.
History
Email Address of Submitting Author
fuliwang@udel.edu
ORCID of Submitting Author
0000-0002-7062-9432
Submitting Author's Institution
University of Delaware
Submitting Author's Country
United States of America