Widely used recurrent units, including Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), perform well on natural language tasks, but their ability to learn structured representations remains questionable. Exploiting reduced Tensor Product Representations (TPRs) --- distributed representations of symbolic structure in which vector-embedded symbols are bound to vector-embedded structural positions --- we propose the TPRU, a simple recurrent unit that, at each time step, explicitly executes structural-role binding and unbinding operations to incorporate structural information into learning. A gradient analysis of the proposed TPRU supports our model design, and its performance on multiple datasets shows the effectiveness of our design choices. Furthermore, a linguistically grounded study demonstrates the interpretability of our TPRU.
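To make the binding and unbinding operations the abstract refers to concrete, here is a minimal NumPy sketch of classical TPR binding and unbinding (Smolensky-style): a filler (symbol) vector is bound to a role (position) vector via an outer product, bindings are summed into one tensor, and a filler is recovered by contracting the tensor with the corresponding role. This is only an illustration of the underlying TPR operations, not the paper's TPRU cell; all dimensions and variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative TPR binding/unbinding, NOT the TPRU recurrent unit itself.
d_filler, d_role = 4, 3
rng = np.random.default_rng(0)

# Vector-embedded symbols (fillers) and structural positions (roles).
fillers = rng.standard_normal((2, d_filler))               # two symbols
roles = np.linalg.qr(rng.standard_normal((d_role, 2)))[0].T  # two orthonormal roles

# Binding: outer product of each filler with its role;
# the whole structure is the sum of the bindings.
T = sum(np.outer(f, r) for f, r in zip(fillers, roles))    # shape (d_filler, d_role)

# Unbinding: contract the TPR with a role vector to recover its filler.
# With orthonormal roles, the unbinding (dual) vector is the role itself.
recovered = T @ roles[0]
assert np.allclose(recovered, fillers[0], atol=1e-6)
```

In the TPRU described by the abstract, operations of this kind are executed inside the recurrent update at every time step, so that structural information is built into the hidden state rather than learned implicitly.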