Publications of Torsten Hoefler
Julia Bazinska, Andrei Ivanov, Tal Ben-Nun, Nikoli Dryden, Maciej Besta, Siyuan Shen, Torsten Hoefler:
Cached Operator Reordering: A Unified View for Fast GNN Training
(arXiv:2308.12093, Aug. 2023)
Abstract: Graph Neural Networks (GNNs) are a powerful tool for handling structured graph data and addressing tasks such as node classification, graph classification, and clustering. However, the sparse nature of GNN computation poses new challenges for performance optimization compared to traditional deep neural networks. We address these challenges by providing a unified view of GNN computation, I/O, and memory. By analyzing the computational graphs of the Graph Convolutional Network (GCN) and Graph Attention (GAT) layers - two widely used GNN layers - we propose alternative computation strategies. We present adaptive operator reordering with caching, which achieves a speedup of up to 2.43x for GCN compared to the current state-of-the-art. Furthermore, an exploration of different caching schemes for GAT yields a speedup of up to 1.94x. The proposed optimizations save memory, are easily implemented across various hardware platforms, and have the potential to alleviate performance bottlenecks in training large-scale GNN models.
BibTeX:
@article{bazinska2023cached,
  author={Julia Bazinska and Andrei Ivanov and Tal Ben-Nun and Nikoli Dryden and Maciej Besta and Siyuan Shen and Torsten Hoefler},
  title={{Cached Operator Reordering: A Unified View for Fast GNN Training}},
  journal={arXiv:2308.12093},
  year={2023},
  month={Aug.},
  source={http://www.unixer.de/~htor/publications/},
}
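
The sketch below illustrates the general operator-reordering idea mentioned in the abstract: a GCN layer computes a product of the form A X W, and whether the (sparse) adjacency multiplication happens before or after the dense feature transformation changes the cost. This is an assumption-laden illustration, not the paper's implementation; the function name gcn_layer, the size heuristic, and all dimensions are hypothetical.

# Illustrative sketch (not the paper's code): reorder the two matrix products
# in a GCN-style layer, A @ X @ W, based on which order is cheaper.
import numpy as np
import scipy.sparse as sp

def gcn_layer(A, X, W):
    """Pick a multiplication order using a simple feature-width heuristic."""
    f_in, f_out = W.shape
    if f_out < f_in:
        # Shrink features first, then propagate over the graph: A @ (X @ W).
        return A @ (X @ W)
    # Propagate first, then transform the (already narrow) features: (A @ X) @ W.
    return (A @ X) @ W

# Toy usage with a random sparse graph (all sizes are made up for the example).
N, F_in, F_out = 1000, 256, 64
A = sp.random(N, N, density=0.01, format="csr")  # stand-in for a normalized adjacency
X = np.random.rand(N, F_in)                      # node features
W = np.random.rand(F_in, F_out)                  # layer weights
print(gcn_layer(A, X, W).shape)                  # (1000, 64)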