CTRL: A Conditional Transformer Language Model for Controllable Generation

20 Sep 2019 | Nitish Shirish Keskar, Bryan McCann, Lav R. Varshney, Caiming Xiong, Richard Socher
CTRL is a 1.63-billion-parameter conditional transformer language model designed to generate text with explicit control over style, content, and task-specific behavior. The model is trained to condition on control codes derived from structure that naturally co-occurs with raw text; these codes can specify domain, style, topics, dates, entities, relationships between entities, and task-related behavior. Because every training sequence is paired with a control code, CTRL can also predict which parts of the training data are most likely given a sequence, providing a method for analyzing large amounts of text through model-based source attribution. The release of CTRL aims to advance controllable generation in natural language processing and to encourage further research in this area.
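As a rough illustration of how both uses work in practice, the sketch below relies on the Hugging Face `transformers` port of CTRL; the hub identifier `Salesforce/ctrl` and the particular control codes (`Horror`, `Reviews`, `Wikipedia`) are assumptions for this example, not details stated in the summary above. Generation prepends a control code to the prompt, and source attribution ranks control codes by the total log-probability each one leads the model to assign to a fixed sequence (a uniform prior over codes is assumed).

```python
# Minimal sketch of (1) control-code-conditioned generation and
# (2) model-based source attribution with CTRL.
# Assumption: the Hugging Face `transformers` port of CTRL under the
# hub identifier "Salesforce/ctrl"; control-code strings are illustrative.
import torch
from transformers import CTRLTokenizer, CTRLLMHeadModel

tokenizer = CTRLTokenizer.from_pretrained("Salesforce/ctrl")
model = CTRLLMHeadModel.from_pretrained("Salesforce/ctrl")
model.eval()

# (1) Generation: the control code is simply the first token of the
# prompt, steering the domain/style of everything that follows.
prompt = "Horror A knife"  # "Horror" is the control code
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output = model.generate(input_ids, max_length=50, repetition_penalty=1.2)
print(tokenizer.decode(output[0]))

# (2) Source attribution: score p(sequence | code) for each candidate
# control code; under a uniform prior over codes, ranking these scores
# is equivalent to ranking p(code | sequence).
def total_log_prob(text: str) -> float:
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels=ids the model returns the mean token cross-entropy,
        # so multiply back by the number of predicted tokens.
        loss = model(ids, labels=ids).loss
    return -loss.item() * (ids.shape[1] - 1)

sequence = "Global warming is a lie."
for code in ["Wikipedia", "Reviews", "Horror"]:
    print(code, total_log_prob(f"{code} {sequence}"))
```

The `repetition_penalty=1.2` setting mirrors the penalized sampling scheme proposed in the CTRL paper to discourage degenerate repetition; any of the model's other control codes could be substituted into either step.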