In the existing literature, GNN training has been performed mostly in centralized, and sometimes federated, settings. In this work, we consider a fully decentralized, data-private scenario, where each node has only limited knowledge of the surrounding graph. We propose the first architecture that enables GNN training in this fully decentralized setting, by carefully combining several techniques, including decoupled learning, self-supervision, and Gossip Learning. We implement two simulation tools to experimentally evaluate our solution. The results show that the proposed technique can be effectively used in scenarios where centralized or federated approaches are infeasible or undesirable.
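To make the Gossip Learning component mentioned in the abstract more concrete, the sketch below shows one possible model-exchange round in Python/PyTorch: each node trains a small local encoder and periodically pushes its parameters to a randomly chosen peer, which averages them into its own model. This is a minimal, hypothetical illustration under assumed names (`LocalEncoder`, `gossip_merge`, uniform 50/50 averaging); it is not the authors' exact protocol or message format.

```python
import random
import torch
import torch.nn as nn

class LocalEncoder(nn.Module):
    """Tiny stand-in for a per-node encoder trained locally (assumption, not the paper's model)."""
    def __init__(self, in_dim: int = 16, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def gossip_merge(local: nn.Module, received_state: dict) -> None:
    """Merge a received parameter set into the local model by uniform averaging.

    The 50/50 weighting is an illustrative choice; real Gossip Learning variants
    often weight by model age or number of training samples seen.
    """
    local_state = local.state_dict()
    merged = {k: 0.5 * local_state[k] + 0.5 * received_state[k] for k in local_state}
    local.load_state_dict(merged)

# One gossip round over a toy set of nodes: each node sends its current weights
# to a random peer, which merges them into its own model. Local (self-supervised)
# training between rounds is omitted here for brevity.
nodes = [LocalEncoder() for _ in range(4)]
for sender in nodes:
    peer = random.choice([n for n in nodes if n is not sender])
    gossip_merge(peer, sender.state_dict())
```

In a full decentralized setup, each node would interleave such exchange rounds with local optimization steps on its private subgraph; only model parameters, never raw node data, cross the network.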
Decoupled Learning, Decentralized Learning, Self-Supervised Learning, Graph Neural Networks, Gossip Learning
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views | | 35 |
| Downloads | | 21 |

Views and downloads provided by UsageCounts.