Scaling R-GCN Training with Graph Summarization

Alessandro Generale, Till Blume, Michael Cochez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

Training of Relational Graph Convolutional Networks (R-GCN) is a memory-intensive task. The amount of gradient information that needs to be stored during training on real-world graphs is often too large for the memory available on most GPUs. In this work, we experiment with graph summarization techniques to compress the graph and hence reduce the amount of memory needed. After training the R-GCN on the graph summary, we transfer the weights back to the original graph and attempt to perform inference on it. We obtain reasonable results on the AIFB, MUTAG, and AM datasets. Our experiments show that training on the graph summary can yield an accuracy comparable to or higher than training on the original graph. Furthermore, if we leave the time needed to compute the summary out of the equation, we observe that the smaller graph representations obtained with graph summarization methods reduce the computational overhead. However, further experiments are needed to evaluate additional graph summary models and to determine whether our findings also hold for very large graphs.
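The abstract describes a three-step workflow: train the R-GCN on a compressed graph summary, transfer the learned weights into an identically shaped model, and run inference on the original graph. The sketch below illustrates that workflow only; it is not the authors' code. It assumes PyTorch Geometric's RGCNConv, fixed-dimension node features shared by the summary and the original graph (so weight shapes match), and illustrative data objects (summary, original) carrying x, edge_index, edge_type, train_idx, and train_y attributes.

import torch
import torch.nn.functional as F
from torch_geometric.nn import RGCNConv


class RGCN(torch.nn.Module):
    """Two-layer R-GCN for entity classification."""

    def __init__(self, in_dim, hidden_dim, num_classes, num_relations):
        super().__init__()
        self.conv1 = RGCNConv(in_dim, hidden_dim, num_relations, num_bases=30)
        self.conv2 = RGCNConv(hidden_dim, num_classes, num_relations, num_bases=30)

    def forward(self, x, edge_index, edge_type):
        x = F.relu(self.conv1(x, edge_index, edge_type))
        return self.conv2(x, edge_index, edge_type)


def train_on_summary(summary, num_classes, num_relations, epochs=50):
    # Train on the (much smaller) graph summary, so less gradient
    # information has to be kept in GPU memory.
    model = RGCN(summary.x.size(1), 16, num_classes, num_relations)
    opt = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
    for _ in range(epochs):
        opt.zero_grad()
        out = model(summary.x, summary.edge_index, summary.edge_type)
        loss = F.cross_entropy(out[summary.train_idx], summary.train_y)
        loss.backward()
        opt.step()
    return model


def infer_on_original(trained, original, num_classes, num_relations):
    # Copy the weights learned on the summary into a model of the same
    # shape and predict node classes on the original graph.
    model = RGCN(original.x.size(1), 16, num_classes, num_relations)
    model.load_state_dict(trained.state_dict())
    model.eval()
    with torch.no_grad():
        logits = model(original.x, original.edge_index, original.edge_type)
    return logits.argmax(dim=-1)

How the summary itself is computed (and how its node features are derived from the original graph) is left out here, since that depends on the graph summarization model used.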

Original language: English
Title of host publication: WWW '22: Companion Proceedings of the Web Conference 2022
Place of publication: Virtual Event, Lyon, France
Publisher: Association for Computing Machinery
Pages: 1073-1082
Number of pages: 10
ISBN (Electronic): 9781450391306
ISBN (Print): 978-1-4503-9130-6
DOIs
State: Published - Apr 1 2022
Externally published: Yes
Event: 31st Companion of the World Wide Web Conference, WWW 2022 - Virtual, Lyon, France
Duration: Apr 25 2022 → …

Publication series

Name: WWW '22
Publisher: Association for Computing Machinery

Conference

Conference: 31st Companion of the World Wide Web Conference, WWW 2022
Country/Territory: France
City: Virtual, Lyon
Period: 04/25/22 → …

Keywords

  • graph neural network
  • graph summarization
  • scalability
