The ability to reason beyond data fitting is essential for deep learning systems to make a leap toward artificial general intelligence. Much effort has been devoted to modeling neural reasoning as an iterative decision-making process built on recurrent networks and reinforcement learning. Instead, inspired by the consciousness prior proposed by Yoshua Bengio, we explore reasoning through the notion of attentive awareness from a cognitive perspective and formulate it as attentive message passing on graphs, called neural consciousness flow (NeuCFlow). Aiming to bridge the gap between deep learning systems and reasoning, we propose an attentive computation framework with a three-layer architecture, consisting of an unconsciousness flow layer, a consciousness flow layer, and an attention flow layer. We implement the NeuCFlow model with graph neural networks (GNNs) and conditional transition matrices. Our attentive computation greatly reduces the complexity of vanilla GNN-based methods and is capable of running on large-scale graphs. We validate our model for knowledge graph reasoning by solving a series of knowledge base completion (KBC) tasks. The experimental results show that NeuCFlow significantly outperforms previous state-of-the-art KBC methods, including embedding-based and path-based approaches. Reproducible code can be found at the link below.
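To make the idea of attention-restricted message passing concrete, the following is a minimal sketch, not the authors' implementation: it illustrates how limiting each GNN update to a few highly attended edges can cut the cost of a vanilla message-passing step. The graph, dimensions, dot-product attention, and top-k edge selection are all illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions only): attention-guided message
# passing on a tiny graph, where each node aggregates messages from only its
# most attended incoming edges instead of all neighbors.
import numpy as np

rng = np.random.default_rng(0)
num_nodes, dim, top_k = 6, 8, 2

# Toy directed graph as an edge list, plus random node states.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (3, 5), (4, 5)]
h = rng.normal(size=(num_nodes, dim))            # node states
W = rng.normal(size=(dim, dim)) / np.sqrt(dim)   # shared message transform

def attentive_step(h, edges, top_k):
    """One message-passing step that keeps only the top-k attended in-edges per node."""
    scores = {(s, d): float(h[d] @ h[s]) for (s, d) in edges}  # dot-product attention logits
    new_h = h.copy()
    for dst in range(len(h)):
        in_edges = [(s, d) for (s, d) in edges if d == dst]
        # Keep only the most attended incoming edges (sparse, attention-gated computation).
        kept = sorted(in_edges, key=lambda e: scores[e], reverse=True)[:top_k]
        if not kept:
            continue
        logits = np.array([scores[e] for e in kept])
        weights = np.exp(logits - logits.max())
        weights /= weights.sum()
        msg = sum(w * (W @ h[s]) for w, (s, _) in zip(weights, kept))
        new_h[dst] = np.tanh(h[dst] + msg)
    return new_h

h = attentive_step(h, edges, top_k)
print(h.shape)  # (6, 8)
```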