Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Efficient Heterogeneous Graph Learning via Random Projection

Jun Hu, Bryan Hooi, Bingsheng He

2023-10-23 · Heterogeneous Node Classification · Graph Learning · Node Property Prediction

Paper · PDF · Code (official)

Abstract

Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs. Typical HGNNs require repetitive message passing during training, which limits their efficiency on large-scale real-world graphs. Recent pre-computation-based HGNNs instead use one-time message passing to transform a heterogeneous graph into regular-shaped tensors, enabling efficient mini-batch training. Existing pre-computation-based HGNNs fall mainly into two styles, which differ in how they trade off efficiency against information loss. We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN), which combines the efficiency of one style with the low information loss of the other. To achieve efficiency, the main framework of RpHGNN consists of propagate-then-update iterations, where we introduce a Random Projection Squashing step to ensure that complexity increases only linearly. To achieve low information loss, we introduce a Relation-wise Neighbor Collection component with an Even-odd Propagation Scheme, which collects information from neighbors in a finer-grained way. Experimental results indicate that our approach achieves state-of-the-art results on seven small and large benchmark datasets while also being 230% faster than the most effective baseline. Surprisingly, our approach not only surpasses pre-processing-based baselines but also outperforms end-to-end methods.
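The core efficiency idea — squashing a growing concatenation of propagated features back to a fixed width with a random projection, so per-iteration cost stays linear — can be sketched as follows. This is an illustrative sketch only, not the paper's implementation; the function name `random_projection_squash` and all shapes are assumptions.

```python
import numpy as np

def random_projection_squash(collected, dim, seed=0):
    """Squash a list of propagated feature blocks back to a fixed
    dimension via a Gaussian random projection (Johnson-Lindenstrauss
    style), so the feature width does not grow across iterations.
    Hypothetical sketch; not the authors' code."""
    rng = np.random.default_rng(seed)
    # Concatenating per-relation neighbor features widens the representation...
    concat = np.concatenate(collected, axis=1)      # (n_nodes, k * dim_in)
    # ...so project it back down with a random matrix, scaled to roughly
    # preserve pairwise distances.
    proj = rng.normal(0.0, 1.0 / np.sqrt(dim), size=(concat.shape[1], dim))
    return concat @ proj                            # (n_nodes, dim)

# Toy usage: three propagated feature blocks squashed to 16 dimensions.
feats = [np.random.rand(5, 8) for _ in range(3)]
out = random_projection_squash(feats, dim=16)
print(out.shape)  # (5, 16)
```

Because the output width is fixed at `dim` regardless of how many blocks are collected, repeating propagate-then-squash iterations keeps memory and compute linear in the number of iterations rather than exponential.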

Results

Task | Dataset | Metric | Value | Model
Node Classification | IMDB (Heterogeneous Node Classification) | Macro-F1 | 67.53 | RpHGNN
Node Classification | IMDB (Heterogeneous Node Classification) | Micro-F1 | 69.77 | RpHGNN
Node Classification | Freebase (Heterogeneous Node Classification) | Macro-F1 | 54.02 | RpHGNN
Node Classification | Freebase (Heterogeneous Node Classification) | Micro-F1 | 66.55 | RpHGNN
Node Classification | DBLP (Heterogeneous Node Classification) | Macro-F1 | 95.23 | RpHGNN
Node Classification | DBLP (Heterogeneous Node Classification) | Micro-F1 | 95.55 | RpHGNN
Node Classification | ACM (Heterogeneous Node Classification) | Macro-F1 | 94.09 | RpHGNN
Node Classification | ACM (Heterogeneous Node Classification) | Micro-F1 | 94.04 | RpHGNN
Node Classification | OAG-Venue | MRR | 35.46 | RpHGNN
Node Classification | OAG-Venue | NDCG | 53.31 | RpHGNN
Node Classification | OAG-L1-Field | MRR | 86.79 | RpHGNN
Node Classification | OAG-L1-Field | NDCG | 87.8 | RpHGNN
Node Property Prediction | ogbn-mag | Number of params | 7,720,368 | RpHGNN+LP+CR (LINE embs)

Related Papers

SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
A Graph-in-Graph Learning Framework for Drug-Target Interaction Prediction (2025-07-15)
Graph World Model (2025-07-14)
Federated Learning with Graph-Based Aggregation for Traffic Forecasting (2025-07-13)
Graph Learning (2025-07-08)
GDGB: A Benchmark for Generative Dynamic Text-Attributed Graph Learning (2025-07-04)
S2FGL: Spatial Spectral Federated Graph Learning (2025-07-03)
Interpretable Hierarchical Concept Reasoning through Attention-Guided Graph Learning (2025-06-26)