
handle: 2123/32417
Various real-world graphs grow over time, motivating research on continual graph learning (CGL), which aims to accommodate new tasks over newly emerging graph data while maintaining model performance on existing tasks. First, we study CGL task configurations in different application scenarios and develop a comprehensive Continual Graph Learning Benchmark (CGLB). CGLB contains comprehensive CGL tasks under various experimental settings, as well as a toolkit for developing CGL techniques. Second, we develop a series of CGL techniques: 1) Hierarchical Prototype Networks (HPNs), 2) Sparsified Subgraph Memory (SSM), and 3) Subgraph Episodic Memory (SEM). HPNs are designed to extract basic shareable features and store them in prototypes; the forgetting problem is thereby alleviated through knowledge sharing and independently updated prototypes. Next, SSM is a memory-replay-based CGL technique that stores a set of representative historical data from previous tasks to replay while learning new tasks. Although topological information is critical for characterizing graph data, existing memory-replay-based CGL techniques store only individual nodes for replay and discard the topological information because of the memory explosion problem. To this end, SSM sparsifies the selected computation subgraphs to a fixed size before storing them in memory. This significantly reduces the memory consumption of a computation subgraph and, for the first time, enables GNNs to utilize explicit topological information for memory replay. Based on SSM, we develop SEM, which adopts graph Ricci curvature as the criterion for computation subgraph sparsification. Finally, in experiments, we study various real-world graph data, including social networks, citation networks, product co-purchasing networks, scene graphs, and molecular graphs.
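The core idea behind SSM and SEM, replaying fixed-size sparsified computation subgraphs instead of isolated nodes, can be illustrated with a minimal sketch. This is an assumption-laden toy version: it uses uniform random neighbor sampling in place of SSM's actual sparsification criterion (SEM would rank neighbors by Ricci curvature instead), plain adjacency-list dicts in place of a GNN framework's graph objects, and a hypothetical `ReplayMemory` class not taken from the thesis.

```python
import random

def sparsify_subgraph(adj, center, budget, seed=0):
    """Keep the center node plus at most budget - 1 of its neighbors.

    Uniform random sampling is a placeholder for SSM's selection rule;
    SEM would instead score edges by graph Ricci curvature.
    """
    rng = random.Random(seed)
    neighbors = list(adj.get(center, []))
    kept = rng.sample(neighbors, min(len(neighbors), budget - 1))
    nodes = {center, *kept}
    # Return the adjacency induced on the kept nodes, so the stored
    # subgraph retains explicit topology rather than isolated nodes.
    return {u: [v for v in adj.get(u, []) if v in nodes] for u in nodes}

class ReplayMemory:
    """Hypothetical fixed-budget buffer of sparsified subgraphs for replay."""

    def __init__(self, budget_per_graph):
        self.budget = budget_per_graph
        self.buffer = []

    def store(self, adj, center):
        # Sparsify before storing, bounding per-subgraph memory cost.
        self.buffer.append(sparsify_subgraph(adj, center, self.budget))

    def sample(self, k, seed=0):
        # Draw stored subgraphs to replay while training on a new task.
        rng = random.Random(seed)
        return rng.sample(self.buffer, min(k, len(self.buffer)))
```

A training loop would interleave losses on the new task's data with losses recomputed on subgraphs drawn from `sample`, which is what lets the GNN exploit stored topology during replay.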
Continual Learning, Lifelong Graph Learning, Continual Graph Learning, Graph Representation Learning, 004
| Indicator | Description | Value |
| --- | --- | --- |
| selected citations | Citations derived from selected sources; an alternative to the "influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
