Inductive Representation Learning on Temporal Graphs
Thursday Nov 5 2020 17:00 GMT
Why This Is Interesting

Inductive representation learning on temporal graphs is an important step toward scalable machine learning on real-world dynamic networks. The evolving nature of temporal dynamic graphs requires handling new nodes as well as capturing temporal patterns. The node embeddings, which are now functions of time, should represent both the static node features and the evolving topological structures. Moreover, node and topological features can themselves be temporal, and the node embeddings should capture those patterns as well. The authors propose the temporal graph attention (TGAT) layer to efficiently aggregate temporal-topological neighborhood features and to learn time-feature interactions. TGAT uses the self-attention mechanism as its building block and develops a novel functional time encoding technique based on the classical Bochner's theorem from harmonic analysis. By stacking TGAT layers, the network treats node embeddings as functions of time and is able to inductively infer embeddings for both new and observed nodes as the graph evolves. The proposed approach handles both node classification and link prediction tasks, and can be naturally extended to include temporal edge features. They evaluate their method on transductive and inductive tasks under temporal settings with two benchmark datasets and one industrial dataset. The TGAT model compares favorably to state-of-the-art baselines as well as previous temporal graph embedding approaches.
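As a rough intuition for the functional time encoding mentioned above: Bochner's theorem lets a translation-invariant kernel over time differences be approximated by sinusoids, so a time delta can be mapped to a vector of cosines and sines with learnable frequencies. The sketch below is illustrative only (fixed example frequencies; in the paper the frequencies are trained end to end):

```python
import numpy as np

def time_encoding(t, omegas):
    """Map a time delta t to sinusoidal features with frequencies `omegas`.

    A sketch of a Bochner-style functional time encoding: the inner
    product of two encodings depends only on the time difference,
    approximating a translation-invariant kernel.
    """
    d = len(omegas)
    # cos/sin pairs; the sqrt(1/d) scale keeps inner products bounded.
    return np.sqrt(1.0 / d) * np.concatenate(
        [np.cos(omegas * t), np.sin(omegas * t)]
    )

# Illustrative (hypothetical) frequencies, not the trained values.
omegas = np.array([1.0, 0.5, 0.1, 0.01])
phi = time_encoding(2.5, omegas)

# Kernel property: encoding dot products depend only on t1 - t2,
# via cos(a)cos(b) + sin(a)sin(b) = cos(a - b).
k = time_encoding(3.0, omegas) @ time_encoding(0.5, omegas)
```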

Discussion Points
  • Modelling temporal graph neural networks with a time encoding function
  • Motivation behind using Bochner’s theorem for temporal encoding
  • Incorporating temporal embeddings for any GNN layer
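To make the last two points concrete, here is a minimal single-head sketch of how a time encoding can be plugged into an attention-style GNN aggregation: each temporal neighbor's feature vector is concatenated with the encoding of its time delta before dot-product attention. This is an illustrative simplification with hypothetical shapes, not the authors' exact TGAT parameterization (which adds learned projections and multiple heads):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def temporal_attention(h_target, h_neighbors, dt_neighbors, omegas):
    """Aggregate temporal neighbors with time-aware attention (sketch).

    Each neighbor feature is concatenated with the sinusoidal encoding
    of its time delta, then attended to with dot-product attention.
    """
    d = len(omegas)

    def enc(t):
        return np.sqrt(1.0 / d) * np.concatenate(
            [np.cos(omegas * t), np.sin(omegas * t)]
        )

    # The target node uses time delta 0 ("now"); neighbors use the
    # elapsed time since their interactions.
    q = np.concatenate([h_target, enc(0.0)])
    keys = np.stack([
        np.concatenate([h, enc(dt)])
        for h, dt in zip(h_neighbors, dt_neighbors)
    ])
    # Scaled dot-product attention over the temporal neighborhood.
    weights = softmax(keys @ q / np.sqrt(len(q)))
    return weights @ keys  # weighted sum of [feature ‖ time] vectors
```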