T-HyperGNNs: Hypergraph Neural Networks Via Tensor Representations
  • Fuli Wang ,
  • Karelia Pena-Pena ,
  • Wei Qian ,
  • Gonzalo Arce
University of Delaware

Corresponding Author: [email protected]

Abstract

Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and have achieved state-of-the-art performance in a broad range of applications, there have been limited attempts to explore high-dimensional hypergraph descriptors (tensors) and the joint node interactions carried by hyperedges. In this paper, we depart from hypergraph matrix representations and present a new tensor-HyperGNN framework (T-HyperGNN) with cross-node interactions. The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra, which closely connects to the spectral space. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution approach to formulate the T-spatial convolution, and we further devise a novel tensor message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared to state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a cross-node interaction layer. These advantages of our T-HyperGNNs are demonstrated on a wide range of real-world hypergraph datasets.
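The t-product algebra underlying the T-spectral convolution can be computed efficiently in the Fourier domain: transforming both tensors along the third mode, multiplying the frontal slices pairwise, and transforming back. The sketch below is a minimal NumPy illustration of this standard t-product computation, not the paper's full convolution layer; the function name `t_product` and the shapes are our own illustrative choices.

```python
import numpy as np

def t_product(A, B):
    """Tensor t-product of A (n1 x n2 x n3) and B (n2 x m x n3).

    Computed via FFT along the third mode, slice-wise matrix
    multiplication in the Fourier domain, and an inverse FFT.
    Equivalent to block-circulant matrix multiplication along mode 3.
    """
    A_hat = np.fft.fft(A, axis=2)            # transform each tube
    B_hat = np.fft.fft(B, axis=2)
    # Multiply corresponding frontal slices in the Fourier domain.
    C_hat = np.einsum('ijk,jlk->ilk', A_hat, B_hat)
    return np.fft.ifft(C_hat, axis=2).real   # real inputs give real output
```

As a sanity check, the identity tensor (identity matrix in the first frontal slice, zeros elsewhere) acts as a multiplicative identity under the t-product.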