Learning Hypergraphs Tensor Representations from Data via t-HGSP
  • Karelia Pena-Pena,
  • Lucas Taipe,
  • Fuli Wang,
  • Daniel Lau,
  • Gonzalo Arce
Karelia Pena-Pena
University of Delaware

Corresponding Author: [email protected]



Representation learning that considers high-order relationships in data has recently been shown to be advantageous in many applications. The construction of a meaningful hypergraph plays a crucial role in the success of hypergraph-based representation learning methods, and is particularly important in hypergraph neural networks and hypergraph signal processing. However, a meaningful hypergraph may only be available in specific cases. This paper addresses the challenge of learning the underlying hypergraph topology from the data itself. As in graph signal processing applications, we consider the case in which the data possesses certain regularity or smoothness on the hypergraph. To this end, our method builds on the novel tensor-based hypergraph signal processing framework (t-HGSP) that has recently emerged as a powerful tool for preserving the intrinsic high-order structure of data on hypergraphs. Given the hypergraph spectrum and frequency coefficient definitions within the t-HGSP framework, we propose a method to learn the hypergraph Laplacian from data by minimizing the total variation on the hypergraph (TVL-HGSP). Additionally, we introduce an alternative approach (PDL-HGSP) that improves the connectivity of the learned hypergraph without compromising sparsity, and we use primal-dual-based algorithms to reduce the computational complexity. Finally, we combine the proposed learning algorithms with novel tensor-based hypergraph convolutional neural networks to propose hypergraph learning-convolutional neural networks (t-HyperGLNN).
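To illustrate the smoothness principle behind this line of work, the sketch below shows total-variation-based topology learning in the simplest pairwise-graph setting, not the paper's tensor-based hypergraph (t-HGSP) or primal-dual formulations. The functions `learn_laplacian` and `total_variation`, and the Gaussian-kernel weighting, are illustrative assumptions chosen for brevity; the paper's actual optimization differs.

```python
import numpy as np

def total_variation(L, X):
    # Smoothness of signals X (n_nodes x n_signals) on a graph with
    # Laplacian L: tr(X^T L X) = 0.5 * sum_ij W_ij * ||x_i - x_j||^2.
    # Small values mean X varies little across strongly connected nodes.
    return np.trace(X.T @ L @ X)

def learn_laplacian(X, alpha=1.0):
    # Naive smoothness-based learning (illustrative only): edge weights
    # decay with the squared distance between node signal profiles, so
    # nodes with similar observations end up strongly connected.
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise sq. distances
    W = np.exp(-D2 / alpha)          # Gaussian-kernel affinities
    np.fill_diagonal(W, 0.0)         # no self-loops
    L = np.diag(W.sum(axis=1)) - W   # combinatorial Laplacian L = D - W
    return L

# Signals on 4 nodes: nodes 0,1 behave alike; nodes 2,3 behave alike.
X = np.array([[1.0, 1.1],
              [1.0, 1.0],
              [5.0, 5.1],
              [5.0, 5.0]])
L = learn_laplacian(X, alpha=2.0)
```

In this toy example, the learned edge weight between the similar nodes 0 and 1 is far larger than between the dissimilar nodes 0 and 2, so signals that are smooth on the learned graph have low total variation. The paper's TVL-HGSP and PDL-HGSP methods instead operate on hypergraph tensor representations and add explicit sparsity and connectivity control.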