Neural autoregressive sequence models are used to generate sequences in a variety of natural language processing (NLP) tasks, where they are evaluated according to sequence-level task losses. These models are typically trained with maximum likelihood estimation, which ignores the task loss, yet empirically performs well as a surrogate objective...
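The gap between the training objective and the evaluation criterion can be made concrete with a toy sketch (not from the paper): token-level maximum likelihood sums per-step negative log-probabilities, while a sequence-level task loss depends on the whole decoded output. The probabilities, vocabulary size, and exact-match loss below are all hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical per-step model distributions over a vocabulary of 4 tokens,
# for a 3-token target sequence (toy numbers, for illustration only).
probs = np.array([
    [0.7, 0.1, 0.1, 0.1],   # p(y_1 | <bos>)
    [0.2, 0.6, 0.1, 0.1],   # p(y_2 | y_1)
    [0.1, 0.1, 0.1, 0.7],   # p(y_3 | y_1, y_2)
])
target = [0, 1, 3]

# Maximum likelihood estimation minimizes the negative log-likelihood,
# summed token by token -- it never looks at the decoded sequence.
mle_loss = -sum(np.log(probs[t, y]) for t, y in enumerate(target))

# A sequence-level task loss (here: 0/1 exact match of the greedy decode)
# is a function of the entire output, which MLE ignores.
greedy = probs.argmax(axis=1).tolist()
task_loss = 0.0 if greedy == target else 1.0

print(round(mle_loss, 3), task_loss)
```

Because the task loss is non-differentiable and defined over whole sequences, MLE serves only as a surrogate for it, which is the mismatch the abstract refers to.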