Recurrent neural networks (RNNs) have become increasingly popular in information processing tasks involving time series and temporal data. A fundamental property of RNNs is their ability to produce reliable input/output responses, a property often linked to how the network retains memory of previously processed information. Various notions have been proposed to conceptualize the behavior of memory in RNNs, including steady states, echo states, state forgetting, input forgetting, and fading memory. Although these notions are often used interchangeably, their precise relationships remain unclear. This work aims to unify these notions in a common language, derive new implications and equivalences between them, and provide alternative proofs of some existing results. By clarifying the relationships between these concepts, this research contributes to a deeper understanding of RNNs and their temporal information processing capabilities.
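To make the state-forgetting idea concrete, the following sketch (not taken from the paper; the contraction factor, state dimension, and sequence length are illustrative choices) runs a contractive RNN, x_{t+1} = tanh(W x_t + U u_t) with spectral norm of W below one, from two different initial states on the same input sequence. The gap between the two trajectories shrinks geometrically, which is one informal face of the echo state / state-forgetting property discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # state dimension (arbitrary illustrative choice)

# Rescale W so its spectral norm is 0.5; since tanh is 1-Lipschitz,
# the state map contracts distances by at least a factor of 0.5 per step.
W = rng.normal(size=(n, n))
W *= 0.5 / np.linalg.norm(W, 2)
U = rng.normal(size=(n, 1))

def run(x0, inputs):
    """Iterate the state map x -> tanh(W x + U u) from x0 over the inputs."""
    x = x0
    for u in inputs:
        x = np.tanh(W @ x + U @ u)
    return x

inputs = rng.normal(size=(200, 1, 1))  # one shared input sequence

# Two different initial conditions, driven by the same inputs.
xa = run(rng.normal(size=(n, 1)), inputs)
xb = run(rng.normal(size=(n, 1)), inputs)

gap = np.linalg.norm(xa - xb)
print(gap)  # the initial condition has been washed out: gap is numerically zero
```

The same experiment with a spectral norm above one would generally fail to forget its initial state, which is why contraction-type conditions recur in the echo state literature.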