Published September 3, 2025 | Version v1
Journal article (Open Access)

SuperHyperGraph Attention Networks

  • 1. Independent Researcher, Tokyo, Japan

Description

Graph Attention Networks (GATs) employ self-attention to aggregate neighboring node features in graphs, effectively capturing structural dependencies. HyperGraph Attention Networks (HGATs) extend this mechanism to hypergraphs by alternating attention-based vertex-to-hyperedge and hyperedge-to-vertex updates, modeling higher-order relationships. In this work, we introduce the n-SuperHyperGraph Attention Network, which leverages SuperHyperGraphs (a hierarchical generalization of hypergraphs) to perform multi-tier attention among supervertices and superedges. Our investigation is purely theoretical; empirical validation via computational experiments is left for future study.
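The alternating attention scheme that HGAT applies, and that the n-SuperHyperGraph Attention Network generalizes across hierarchy tiers, can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the score functions (a shared context vector `u` for the vertex-to-hyperedge phase and `a` for the hyperedge-to-vertex phase) and the weight shapes are simplifying assumptions.

```python
import numpy as np

def lrelu(x, slope=0.2):
    # LeakyReLU nonlinearity applied to the raw attention scores
    return np.where(x > 0, x, slope * x)

def hgat_layer(X, I, W1, W2, u, a):
    """One alternating attention pass on a hypergraph (illustrative sketch).

    X : (N, F)  vertex features
    I : (N, M)  boolean incidence matrix; I[i, e] means vertex i lies in hyperedge e
    Returns updated vertex features of shape (N, F2).
    """
    N, M = I.shape
    Z = X @ W1                                   # projected vertex features, (N, F1)

    # Phase 1: vertex-to-hyperedge attention. Each hyperedge attends over
    # its member vertices and aggregates them into a hyperedge embedding.
    E = np.zeros((M, Z.shape[1]))
    for e in range(M):
        members = np.flatnonzero(I[:, e])
        s = lrelu(Z[members] @ u)                # one score per member vertex
        alpha = np.exp(s - s.max())
        alpha /= alpha.sum()                     # softmax within the hyperedge
        E[e] = alpha @ Z[members]

    # Phase 2: hyperedge-to-vertex attention. Each vertex attends over its
    # incident hyperedges and aggregates their embeddings back.
    F2 = E @ W2                                  # projected hyperedge features, (M, F2)
    H = np.zeros((N, F2.shape[1]))
    for i in range(N):
        edges = np.flatnonzero(I[i])
        s = lrelu(F2[edges] @ a)
        beta = np.exp(s - s.max())
        beta /= beta.sum()                       # softmax over incident hyperedges
        H[i] = beta @ F2[edges]
    return H

# Tiny demo: 5 vertices, 3 hyperedges (every vertex in at least one hyperedge,
# so each softmax is taken over a non-empty set).
rng = np.random.default_rng(0)
I = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=bool)
X = rng.normal(size=(5, 4))
W1 = rng.normal(size=(4, 6))
W2 = rng.normal(size=(6, 6))
u = rng.normal(size=6)
a = rng.normal(size=6)
H = hgat_layer(X, I, W1, W2, u, a)
```

A multi-tier (SuperHyperGraph) variant would repeat this two-phase pass at each level of the hierarchy, with supervertex embeddings at one tier serving as the vertex features of the next.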

Files

2-10-27SuperHyperGraphAttentionNetworks.pdf

367.5 kB
md5:61b9fb5a5b6565cb781fa37a090ecfb2