Conditional Text Generator using Neural NLP — Screencast
An investigation of GPT-2, PPLM and CTRL for automatic text generation conditioned on sentiment and/or text.
Posted by Sayan Samanta on May 17, 2020 · 1 min read
This is the screencast of the project in which we investigated different language models for generating text conditioned on a specific sentiment and/or keywords.
The work was done in collaboration with Hyunjoon Lee, Alexander Berry, Christina Ye, and Jason Chan.
Jianmo Ni, Jiacheng Li, Julian McAuley, “Justifying Recommendations Using Distantly-Labeled Reviews and Fine-Grained Aspects”, Empirical Methods in Natural Language Processing (EMNLP), 2019, https://www.aclweb.org/anthology/D19-1018/
Sumanth Dathathri, Andrea Madotto, Janice Lan, Jane Hung, Eric Frank, Piero Molino, Jason Yosinski, Rosanne Liu, “Plug and Play Language Models: A Simple Approach to Controlled Text Generation”, arXiv:1912.02164, 2019, https://arxiv.org/abs/1912.02164v4
Nitish Shirish Keskar, Bryan McCann, Lav R. Varshney, Caiming Xiong, Richard Socher, “CTRL: A Conditional Transformer Language Model for Controllable Generation”, arXiv:1909.05858, 2019, https://arxiv.org/abs/1909.05858