Disentangling content and style in the latent space is a prevalent strategy in unpaired text style transfer. However, most current neural models suffer from two major issues: (1) it is difficult to completely strip the style information of a sentence from its semantics; and (2) recurrent neural network (RNN) based encoders and decoders, mediated by a fixed-size latent representation, handle long-term dependencies poorly, resulting in poor preservation of non-stylistic semantic content. In this paper, we propose the Style Transformer, which makes no assumption about the latent representation of the source sentence and leverages the attention mechanism of the Transformer to achieve both better style transfer and better content preservation.
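As a rough illustration of the architecture the abstract describes (not the authors' released implementation), the sketch below shows a Transformer encoder-decoder conditioned on a learned style embedding rather than a disentangled latent vector: the decoder attends directly to every source token state, so no fixed-size bottleneck mediates content transfer. All names (`StyleTransformer`), hyperparameters, and the style-conditioning scheme here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class StyleTransformer(nn.Module):
    """Minimal sketch: a Transformer conditioned on a target-style
    embedding. The source sentence is never compressed into a single
    latent vector; decoder cross-attention sees all source positions.
    Positional encodings are omitted for brevity."""

    def __init__(self, vocab_size, num_styles, d_model=256, nhead=4, num_layers=3):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.style_emb = nn.Embedding(num_styles, d_model)  # one vector per style
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids, style_ids):
        # Prepend the style embedding to the source token embeddings,
        # so every attention layer can consult the target style.
        style = self.style_emb(style_ids).unsqueeze(1)          # (B, 1, d)
        src = torch.cat([style, self.tok_emb(src_ids)], dim=1)  # (B, 1+S, d)
        tgt = self.tok_emb(tgt_ids)                             # (B, T, d)
        # Causal mask keeps decoding autoregressive.
        mask = self.transformer.generate_square_subsequent_mask(
            tgt.size(1)).to(tgt.device)
        hidden = self.transformer(src, tgt, tgt_mask=mask)
        return self.out(hidden)                                 # (B, T, V)
```

In this sketch, flipping `style_ids` at inference time asks the model to rewrite the same source content in a different style; because conditioning happens through attention over the full source sequence rather than a single latent code, long sentences are not forced through a fixed-size bottleneck.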