Modeling nonlinear systems with Volterra series is challenging because the number of kernel coefficients grows exponentially with the model order. This work introduces Bayesian Tensor Network Volterra kernel machines (BTN-V), extending the Bayesian Tensor Network framework to Volterra system identification. BTN-V represents Volterra kernels using canonical polyadic decomposition, reducing model complexity from O(I^D) to O(DIR). By treating all tensor components and hyperparameters as random variables, BTN-V provides predictive uncertainty estimation at no additional computational cost. Sparsity-inducing hierarchical priors enable automatic rank determination and allow fading-memory behavior to be learned directly from data, improving interpretability and preventing overfitting. Empirical results demonstrate competitive accuracy, enhanced uncertainty quantification, and reduced computational cost.
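To make the complexity reduction concrete, the following sketch writes out the degree-$D$ homogeneous Volterra term with memory length $I$ and its CP-factored form; the notation ($\mathbf{u}_t$, $\mathcal{H}_D$, $\mathbf{a}_r^{(d)}$) is illustrative and not taken from the abstract itself. With lagged inputs $\mathbf{u}_t = [\,u(t), u(t-1), \dots, u(t-I+1)\,]^\top$,
\[
y_D(t) \,=\, \sum_{i_1,\dots,i_D=1}^{I} \mathcal{H}_D(i_1,\dots,i_D)\,\mathbf{u}_t(i_1)\cdots\mathbf{u}_t(i_D),
\qquad
\mathcal{H}_D \,\approx\, \sum_{r=1}^{R} \mathbf{a}_r^{(1)} \circ \cdots \circ \mathbf{a}_r^{(D)}
\;\Longrightarrow\;
y_D(t) \,\approx\, \sum_{r=1}^{R} \prod_{d=1}^{D} \big(\mathbf{a}_r^{(d)\top}\mathbf{u}_t\big),
\]
so the full kernel stores $I^D$ coefficients, whereas the $D$ CP factor matrices of size $I \times R$ store only $DIR$ parameters.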