Hierarchical GPT with Congruent Transformers for Multi-Sentence Language Models

18 Sep 2020 · Jihyeon Roh, Huiseong Gim, Soo-Young Lee

We report a GPT-based multi-sentence language model for dialogue generation and document understanding. First, we propose a hierarchical GPT which consists of three blocks, i.e., a sentence encoding block, a sentence generating block, and a sentence decoding block...
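
The abstract names the three-block layout but gives no implementation detail. The sketch below is one plausible PyTorch wiring of a sentence encoding block, a sentence generating block, and a sentence decoding block; the module choices, mean pooling, dimensions, and class name are illustrative assumptions, not the authors' released implementation.

```python
# A minimal sketch of the three-block hierarchy described in the abstract.
# All design choices here (pooling, masks, dimensions) are assumptions.
import torch
import torch.nn as nn


class HierarchicalSentenceLM(nn.Module):
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Sentence encoding block: contextualizes tokens within each sentence
        # independently, so a small embedding dimension suffices.
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.sentence_encoder = nn.TransformerEncoder(enc_layer, num_layers)
        # Sentence generating block: predicts the next sentence embedding from
        # the previous sentence embeddings under a causal mask.
        gen_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.sentence_generator = nn.TransformerEncoder(gen_layer, num_layers)
        # Sentence decoding block: expands a sentence embedding back into tokens.
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.sentence_decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, num_sentences, sent_len) integer token ids
        b, s, t = tokens.shape
        x = self.embed(tokens.view(b * s, t))                      # (b*s, t, d)
        enc = self.sentence_encoder(x)                             # per-sentence token states
        sent_emb = enc.mean(dim=1).view(b, s, -1)                  # one vector per sentence (mean pool)
        sent_mask = torch.triu(torch.full((s, s), float("-inf")), diagonal=1)
        next_sent = self.sentence_generator(sent_emb, mask=sent_mask)  # predicted sentence vectors
        # Teacher-forced token decoding conditioned on a predicted sentence vector.
        memory = next_sent.reshape(b * s, 1, -1)
        tok_mask = torch.triu(torch.full((t, t), float("-inf")), diagonal=1)
        dec = self.sentence_decoder(x, memory, tgt_mask=tok_mask)
        return self.lm_head(dec).view(b, s, t, -1)                 # per-sentence next-token logits
```

A forward pass on a (batch, sentences, tokens) tensor yields token logits for each sentence; shifting targets so the generator's output at position i supervises sentence i+1 is omitted here for brevity.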


