Mixture of Experts (MoE) models scale capacity but often suffer from representation collapse and gradient instability. We propose Dynamic Subspace Composition (DSC), a framework that approximates context-dependent weights via a state-dependent, sparse expansion of a shared basis bank. Formally, DSC models the weight update as a residual trajectory within a Star-Shaped Domain, employing a Magnitude-Gated Simplex Interpolation to ensure continuity at the identity. Unlike standard Mixture-of-LoRAs, which incurs O(Mrd) parameter complexity by retrieving independent rank-r matrices, DSC constructs a compositional rank-K approximation from decoupled unit-norm basis vectors. This reduces parameter complexity to O(Md) and memory traffic to O(Kd), while Frame-Theoretic regularization and spectral constraints provide rigorous worst-case bounds on the dynamic update. The code is available at https://github.com/VladimerKhasia/DSC.
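To make the abstract's construction concrete, below is a minimal sketch of the DSC idea: a frozen base layer plus a dynamic rank-K residual composed from a shared bank of M unit-norm basis vectors, selected sparsely per input and scaled by a magnitude gate so the update vanishes continuously at the identity. The module and parameter names (`DSCLinear`, `router`, `gate`, `num_bases`, `top_k`) and the exact gating and routing formulas are illustrative assumptions, not the paper's definitive implementation.

```python
# Hypothetical sketch of Dynamic Subspace Composition (DSC); router/gate
# design and hyperparameter names are assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DSCLinear(nn.Module):
    """Frozen base linear layer plus a dynamic low-rank residual composed
    from a shared bank of M unit-norm basis vectors (top-K sparse mixture)."""

    def __init__(self, d_in, d_out, num_bases=64, top_k=4):
        super().__init__()
        self.base = nn.Linear(d_in, d_out, bias=False)   # shared frozen weight
        self.base.weight.requires_grad_(False)
        # Decoupled input/output basis banks: O(M d) parameters total,
        # versus O(M r d) for a bank of independent rank-r LoRA adapters.
        self.U = nn.Parameter(torch.randn(num_bases, d_out))
        self.V = nn.Parameter(torch.randn(num_bases, d_in))
        self.router = nn.Linear(d_in, num_bases)         # state-dependent scores
        self.gate = nn.Linear(d_in, 1)                   # magnitude gate
        self.top_k = top_k

    def forward(self, x):
        # x: (batch, d_in)
        scores = self.router(x)                          # (batch, M)
        topv, topi = scores.topk(self.top_k, dim=-1)     # sparse selection of K bases
        alpha = F.softmax(topv, dim=-1)                  # simplex weights over the K bases
        g = torch.sigmoid(self.gate(x))                  # magnitude gate in (0, 1);
                                                         # g -> 0 recovers the base map
        # Unit-norm bases keep the dynamic update spectrally controlled.
        U = F.normalize(self.U, dim=-1)[topi]            # (batch, K, d_out)
        V = F.normalize(self.V, dim=-1)[topi]            # (batch, K, d_in)
        # Rank-K residual applied without materializing a d_out x d_in matrix:
        # delta = sum_k alpha_k * U_k (V_k . x), i.e. O(K d) memory traffic per token.
        proj = torch.einsum('bkd,bd->bk', V, x)          # (batch, K)
        delta = torch.einsum('bk,bk,bkd->bd', alpha, proj, U)
        return self.base(x) + g * delta


# Usage sketch
layer = DSCLinear(d_in=256, d_out=256, num_bases=64, top_k=4)
y = layer(torch.randn(8, 256))                           # (8, 256)
```

In this reading, the softmax over the selected scores keeps the combination on a simplex, while the sigmoid gate controls only the magnitude of the residual, which is one plausible way to realize the "Magnitude-Gated Simplex Interpolation" and continuity-at-identity property the abstract describes.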