We examine some combinatorial properties of parallel cut elimination in multiplicative linear logic (MLL) proof nets. We show that, provided we impose a constraint on some paths, we can bound the size of all the nets satisfying this constraint and reducing to a fixed resultant net. This result gives a sufficient condition for an infinite weighted sum of nets to reduce into another sum of nets, while keeping coefficients finite. We moreover show that our constraints are stable under reduction. Our approach is motivated by the quantitative semantics of linear logic: many models have been proposed, whose structure reflects the Taylor expansion of multiplicative exponential linear logic (MELL) proof nets into infinite sums of differential nets. In order to simulate one cut elimination step in MELL, it is necessary to reduce an arbitrary number of cuts in the differential nets of its Taylor expansion. It turns out that our results apply to differential nets, because their cut elimination is essentially multiplicative. We moreover show that the set of differential nets that occur in the Taylor expansion of an MELL net automatically satisfies our constraints. Interestingly, our nets are untyped: we rely only on the sequentiality of linear logic nets and the dynamics of cut elimination. The paths on which we impose bounds are the switching paths involved in the Danos--Regnier criterion for sequentiality. In order to accommodate multiplicative units and weakenings, our nets come equipped with jumps: each weakening node is connected to some other node. Our constraint can then be summed up as a bound on both the length of switching paths and the number of weakenings that jump to a common node.


