Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


An Attention-based Graph Neural Network for Heterogeneous Structural Learning

Huiting Hong, Hantao Guo, Yu-Cheng Lin, Xiaoqing Yang, Zang Li, Jieping Ye

2019-12-19 · Graph Representation Learning · Representation Learning · Heterogeneous Node Classification · Multi-Task Learning · Node Classification · Graph Embedding

Paper · PDF · Code (official)

Abstract

In this paper, we focus on graph representation learning for heterogeneous information networks (HINs), in which various types of vertices are connected by various types of relations. Most existing methods for HINs adapt homogeneous graph embedding models via meta-paths to learn a low-dimensional vector space for the HIN. In this paper, we propose a novel Heterogeneous Graph Structural Attention Neural Network (HetSANN) that directly encodes the structural information of an HIN without meta-paths and achieves more informative representations. With this method, domain experts are no longer needed to design meta-path schemes, and the heterogeneous information is processed automatically by our proposed model. Specifically, we implicitly represent heterogeneous information in two steps: 1) we model the transformation between heterogeneous vertices through projections into low-dimensional entity spaces; 2) we then apply a graph neural network to aggregate multi-relational information from the projected neighborhood by means of an attention mechanism. We also present three extensions of HetSANN: voices-sharing product attention for pairwise relationships in the HIN, a cycle-consistency loss to retain the transformation between heterogeneous entity spaces, and multi-task learning to make full use of the available information. Experiments conducted on three public datasets demonstrate that our proposed models achieve significant and consistent improvements over state-of-the-art solutions.
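The two steps in the abstract (relation-specific projection into a target node's entity space, then attention-weighted aggregation over the projected neighborhood) can be sketched in NumPy. This is an illustrative toy, not the authors' official implementation: the graph, the relation names (`writes`, `cites`, `self`), the GAT-style concatenation scoring, and the helper names (`W`, `a`, `aggregate`) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# Toy HIN: two author nodes (4-d features) and two paper nodes (6-d features).
feat = {"a1": rng.normal(size=4), "a2": rng.normal(size=4),
        "p1": rng.normal(size=6), "p2": rng.normal(size=6)}

# In an HIN each relation fixes its source/target types, so one projection
# matrix per relation suffices: it maps a source node's features into the
# target node's shared d_out-dimensional entity space (step 1 of the abstract).
d_out = 8
W = {"writes": rng.normal(scale=0.1, size=(d_out, 4)),  # author -> paper
     "cites":  rng.normal(scale=0.1, size=(d_out, 6)),  # paper  -> paper
     "self":   rng.normal(scale=0.1, size=(d_out, 6))}  # paper self-loop
# Per-relation attention vectors, scoring [target ; projected neighbor].
a = {rel: rng.normal(scale=0.1, size=2 * d_out) for rel in W}

edges = [("a1", "p1", "writes"), ("a2", "p1", "writes"),
         ("p2", "p1", "cites"), ("p1", "p1", "self")]

def aggregate(target):
    """One attention-based aggregation step for a single target node."""
    h_self = W["self"] @ feat[target]        # target in its own entity space
    msgs, scores = [], []
    for src, dst, rel in edges:
        if dst != target:
            continue
        m = W[rel] @ feat[src]               # step 1: project the neighbor
        e = leaky_relu(np.concatenate([h_self, m]) @ a[rel])
        msgs.append(m)
        scores.append(e)
    alpha = softmax(np.array(scores))        # step 2: attention over neighbors
    return sum(w * m for w, m in zip(alpha, msgs))

h_p1 = aggregate("p1")
print(h_p1.shape)  # (8,)
```

The attention weights are computed per incoming edge, so different relation types compete for influence on the same target representation, which is what removes the need for hand-designed meta-paths.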

Results

| Task                | Dataset                                  | Metric   | Value | Model   |
|---------------------|------------------------------------------|----------|-------|---------|
| Node Classification | IMDB (Heterogeneous Node Classification) | Macro-F1 | 49.47 | HetSANN |
| Node Classification | IMDB (Heterogeneous Node Classification) | Micro-F1 | 57.68 | HetSANN |
| Node Classification | DBLP (Heterogeneous Node Classification) | Macro-F1 | 78.55 | HetSANN |
| Node Classification | DBLP (Heterogeneous Node Classification) | Micro-F1 | 80.56 | HetSANN |
| Node Classification | ACM (Heterogeneous Node Classification)  | Macro-F1 | 90.02 | HetSANN |
| Node Classification | ACM (Heterogeneous Node Classification)  | Micro-F1 | 89.91 | HetSANN |

Related Papers

- Touch in the Wild: Learning Fine-Grained Manipulation with a Portable Visuo-Tactile Gripper (2025-07-20)
- SMART: Relation-Aware Learning of Geometric Representations for Knowledge Graphs (2025-07-17)
- Spectral Bellman Method: Unifying Representation and Exploration in RL (2025-07-17)
- Boosting Team Modeling through Tempo-Relational Representation Learning (2025-07-17)
- SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
- Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)
- Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
- Language-Guided Contrastive Audio-Visual Masked Autoencoder with Automatically Generated Audio-Visual-Text Triplets from Videos (2025-07-16)