Text style transfer here mainly refers to the non-parallel-data setting; the paper list is as follows:
Delete, Retrieve, Generate: A Simple Approach to Sentiment and Style Transfer (NAACL 2018)
Transforming a sentence to alter a specific attribute while preserving its attribute-independent content.
Training data includes only sentences labeled with their attribute, but not pairs of sentences that differ only in their attributes.
Our strongest method extracts content words by deleting phrases associated with the sentence's original attribute value, retrieves new phrases associated with the target attribute, and uses a neural model to fluently combine these into a final output.
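A minimal sketch of the delete step, flagging attribute phrases by how much more often they occur in the source-attribute corpus than in the target-attribute corpus (the toy corpora, the threshold value, and restricting to unigrams are illustrative assumptions, not the paper's exact setup):

```python
from collections import Counter

def salience(ngram, count_src, count_tgt, lam=1.0):
    # Smoothed ratio: how much more strongly this n-gram is associated
    # with the source attribute than with the target attribute.
    return (count_src[ngram] + lam) / (count_tgt[ngram] + lam)

def delete_attribute_phrases(sentence, count_src, count_tgt, gamma=2.0):
    # Drop words whose salience for the source attribute exceeds gamma;
    # what remains approximates the attribute-independent content c(x).
    return [w for w in sentence.split()
            if salience(w, count_src, count_tgt) < gamma]

# Toy corpora: positive vs. negative restaurant reviews.
pos = Counter("the food was great great service".split())
neg = Counter("the food was terrible awful service".split())

print(delete_attribute_phrases("the food was great", pos, neg))
# "great" is far more frequent in the positive corpus, so it is deleted.
```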
Training:
For DELETEONLY:
Reconstruct the sentences in the training corpus given their content and original attribute value, by maximizing the log-likelihood of each training sentence.
For DELETEANDRETRIEVE: apply some noise to a(x, v^src) to produce a'(x, v^src), so that at test time the model learns to adapt retrieved attribute phrases rather than copy them verbatim.
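From the paper, the two reconstruction objectives above can be written as follows (notation: c(x, v^src) is the content left after deleting the attribute phrases a(x, v^src)):

```latex
\text{DELETEONLY:} \quad
L(\theta) = \sum_{(x, v^{\mathrm{src}}) \in D}
  \log p\big(x \mid c(x, v^{\mathrm{src}}),\, v^{\mathrm{src}};\, \theta\big)

\text{DELETEANDRETRIEVE:} \quad
L(\theta) = \sum_{(x, v^{\mathrm{src}}) \in D}
  \log p\big(x \mid c(x, v^{\mathrm{src}}),\, a'(x, v^{\mathrm{src}});\, \theta\big)
```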
This paper trains the model with a reconstruction objective to generate stylized text.
Unsupervised Controllable Text Formalization (AAAI 2019)
The crux of the framework is a deep neural encoder-decoder that is reinforced with text-transformation knowledge through auxiliary modules (called scorers).
Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation (ACL 2019)
Transformer Network
To enable style control in the standard Transformer framework, an extra style embedding is added as input to the Transformer encoder.
z stands for the representation of the encoded inputs
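A minimal NumPy sketch of that conditioning, assuming the style embedding is prepended as an extra position before the token embeddings (dimensions and the prepend-as-extra-token choice are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, vocab, n_styles = 8, 100, 2

tok_emb = rng.normal(size=(vocab, d_model))      # token embedding table
style_emb = rng.normal(size=(n_styles, d_model)) # one vector per style

def encoder_input(token_ids, style_id):
    # Prepend the style embedding to the token embeddings, so the
    # encoder's self-attention can condition every position on the style.
    tokens = tok_emb[token_ids]                      # (seq_len, d_model)
    return np.vstack([style_emb[style_id], tokens])  # (seq_len+1, d_model)

x = encoder_input([3, 1, 4, 1, 5], style_id=1)
print(x.shape)  # (6, 8): five tokens plus the style position
```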
Discriminator Network
Conditional Discriminator: a sentence x and a proposal style s are fed into the discriminator, which is asked to answer whether the input sentence has the corresponding style.
Multi-class Discriminator: only one sentence is fed into the discriminator, which aims to predict the style of this sentence.
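The two labeling schemes can be sketched as follows (the convention of reserving one extra class for generated sentences in the multi-class variant is an assumption here):

```python
def conditional_label(sentence_style, proposal_style):
    # Conditional discriminator: binary answer -- does the sentence
    # actually carry the proposed style?
    return 1 if sentence_style == proposal_style else 0

def multiclass_label(sentence_style, is_generated, n_styles):
    # Multi-class discriminator: one class per real style, plus an
    # extra class (index n_styles) assumed here for generated text.
    return n_styles if is_generated else sentence_style

print(conditional_label(0, 0))            # real pair -> 1
print(conditional_label(0, 1))            # mismatched pair -> 0
print(multiclass_label(1, False, 2))      # real style-1 sentence -> 1
print(multiclass_label(1, True, 2))       # generated sentence -> 2
```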
Learning Algorithm
Discriminator Learning:
conditional discriminator
multi-class discriminator
Transformer Network Learning:
Self Reconstruction
Cycle Reconstruction
Style Controlling
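The three generator losses above can be sketched as one training step; the `nll` and `disc_loss` stand-ins below are hypothetical toy functions (0/1 mismatch indicators), not the real cross-entropy and discriminator losses:

```python
def nll(pred, target):
    # Toy stand-in for token-level cross-entropy against a reference.
    return 0.0 if pred == target else 1.0

def disc_loss(disc, sentence, style):
    # Toy stand-in for the discriminator's style-classification loss.
    return 0.0 if disc(sentence) == style else 1.0

def training_step(model, disc, x, s_src, s_tgt):
    # Self reconstruction: transferring x to its own style should
    # reproduce x.
    loss_self = nll(model(x, s_src), x)
    # Cycle reconstruction: x -> target style -> back to source style
    # should recover x, preserving style-independent content.
    y = model(x, s_tgt)
    loss_cycle = nll(model(y, s_src), x)
    # Style controlling: the discriminator should judge y as s_tgt,
    # pushing the generator to actually change the style.
    loss_style = disc_loss(disc, y, s_tgt)
    return loss_self + loss_cycle + loss_style

# With an identity "model" and a discriminator that always says style 0,
# only the style-controlling term is nonzero.
total = training_step(lambda x, s: x, lambda sent: 0, "hello", 0, 1)
print(total)  # 1.0
```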