Dynamic graphs arise in many practical scenarios, such as social networks, communication networks, and financial transaction networks. Given a dynamic graph, it is fundamental to learn a representation that not only preserves structural proximity but also captures time-evolving patterns. Recently, the graph convolutional network (GCN) has been widely explored in non-Euclidean application domains. The success of the GCN, especially in modeling dependencies and passing messages among nodes, lies in its approximation to Laplacian smoothing. However, this smoothing not only draws must-link node pairs closer but also pushes cannot-link pairs together, which can cause a serious feature-shrinking or oversmoothing problem, especially when graph convolutions are stacked over multiple layers or steps. For learning time-evolving patterns, a natural solution is to preserve the historical state and combine it with current interactions to obtain the most recent representation. Prevalent methods thus stack graph convolutions, explicitly or implicitly, and the resulting feature shrinking or oversmoothing can make nodes too similar to distinguish from one another. To solve this problem in dynamic graph embedding, we first analyze the shrinking properties of the node embedding space, and then design a simple yet versatile method that exploits an L2 feature-normalization constraint to rescale all nodes onto the hypersphere of a unit ball, so that nodes do not shrink together while similar nodes can still move closer. Extensive experiments on four real-world dynamic graph datasets, compared against competitive baseline models, demonstrate the effectiveness of the proposed method.
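The L2 feature-normalization constraint described above can be sketched as follows. This is a minimal illustration, not the authors' exact implementation; the function name `l2_normalize` and the `eps` guard are illustrative assumptions. The key property is that each embedding is rescaled to unit norm, so magnitudes cannot collapse toward zero, while directional (angular) differences between nodes are preserved.

```python
import numpy as np

def l2_normalize(embeddings, eps=1e-12):
    """Rescale each row (node embedding) onto the unit hypersphere.

    embeddings: array of shape (num_nodes, dim).
    eps guards against division by zero for all-zero rows (illustrative choice).
    """
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / np.maximum(norms, eps)

# Toy example: two embeddings with very different magnitudes but different
# directions. After normalization both lie on the unit circle, so they no
# longer "shrink together" in scale, yet remain distinguishable by angle.
Z = np.array([[0.3, 0.4],
              [4.0, -3.0]])
Z_norm = l2_normalize(Z)
```

Because every normalized embedding has unit norm, proximity between nodes is then determined purely by direction (i.e., cosine similarity), which is consistent with the paper's goal of preventing the embedding space from collapsing while still letting similar nodes draw closer.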