Book summarization

5 papers with code • 0 benchmarks • 0 datasets

Book summarization is the task of generating a concise summary of a book-length text, whose length typically far exceeds the input limits of standard summarization models.

Most implemented papers

Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention

Beomi/InfiniTransformer 10 Apr 2024

This work introduces an efficient method to scale Transformer-based Large Language Models (LLMs) to infinitely long inputs with bounded memory and computation.
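As a rough illustration of the idea (not the Beomi/InfiniTransformer implementation), each segment can be processed with ordinary local attention plus a read from a compressive memory matrix that is updated with a linear-attention rule and blended in with a gate. The function name, tensor shapes, and the simple (non-delta) memory update below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def infini_attention_step(q, k, v, memory, z, beta):
    """One segment of Infini-attention-style processing (simplified sketch).

    q, k, v: (seq, d) projections for the current segment
    memory:  (d, d)   compressive memory matrix carried across segments
    z:       (d,)     normalization term carried across segments
    beta:    scalar gate in [0, 1] mixing memory and local outputs (assumed fixed here)
    """
    # Local causal attention within the segment.
    scores = (q @ k.T) / (q.shape[-1] ** 0.5)
    mask = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
    local_out = torch.softmax(scores.masked_fill(mask, float("-inf")), dim=-1) @ v

    # Read from the compressive memory with a linear-attention retrieval.
    sigma_q = F.elu(q) + 1.0
    mem_out = (sigma_q @ memory) / (sigma_q @ z).clamp(min=1e-6).unsqueeze(-1)

    # Update the memory and normalization term with this segment's keys/values.
    sigma_k = F.elu(k) + 1.0
    memory = memory + sigma_k.T @ v
    z = z + sigma_k.sum(dim=0)

    # Blend memory retrieval and local attention, and carry the memory forward.
    return beta * mem_out + (1.0 - beta) * local_out, memory, z
```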

Enhancing Large Language Model with Self-Controlled Memory Framework

wbbeyourself/scm4llms 26 Apr 2023

Large Language Models (LLMs) are constrained by their inability to process lengthy inputs, resulting in the loss of critical historical information.
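To make the memory-augmented idea concrete, here is a toy sketch of storing past turns in a memory stream and retrieving relevant ones before answering. The class, the keyword-overlap scoring, and the `llm` callable are assumptions for illustration; the paper's self-controlled memory framework instead uses an LLM-based controller to decide when and what to retrieve.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryStream:
    """Toy memory stream: stores past turns and retrieves them by keyword overlap."""
    turns: list = field(default_factory=list)

    def add(self, user_input: str, response: str) -> None:
        self.turns.append({"user": user_input, "assistant": response})

    def retrieve(self, query: str, top_k: int = 3) -> list:
        q_words = set(query.lower().split())
        scored = [
            (len(q_words & set((t["user"] + " " + t["assistant"]).lower().split())), t)
            for t in self.turns
        ]
        scored.sort(key=lambda x: x[0], reverse=True)
        return [t for score, t in scored[:top_k] if score > 0]

def answer_with_memory(llm, memory: MemoryStream, user_input: str) -> str:
    """Controller loop: retrieve relevant past turns, prepend them, then respond."""
    relevant = memory.retrieve(user_input)
    context = "\n".join(f"User: {t['user']}\nAssistant: {t['assistant']}" for t in relevant)
    prompt = f"Relevant history:\n{context}\n\nUser: {user_input}\nAssistant:"
    response = llm(prompt)  # `llm` is any callable mapping a prompt to a completion
    memory.add(user_input, response)
    return response
```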

Unlimiformer: Long-Range Transformers with Unlimited Length Input

abertsch72/unlimiformer NeurIPS 2023

Unlimiformer offloads cross-attention to a kNN index that can be kept in either GPU or CPU memory and queried in sub-linear time; this way, practically unlimited input sequences can be indexed, while every attention head in every decoder layer retrieves only its top-k keys instead of attending to every key.
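A minimal sketch of this retrieval-based attention, assuming exact top-k search over a dot-product score matrix in place of the approximate kNN index used by the repository; the function name and tensor shapes are illustrative.

```python
import torch

def knn_augmented_attention(q, encoder_keys, encoder_values, top_k=16):
    """Sketch of retrieval-based cross-attention over a very long input.

    q:              (num_queries, d)  decoder attention queries
    encoder_keys:   (num_tokens, d)   keys for the full (very long) input
    encoder_values: (num_tokens, d)   values for the full input
    """
    # "Query the index": find the k keys with the highest dot product per query.
    scores = q @ encoder_keys.T                       # (num_queries, num_tokens)
    top_scores, top_idx = scores.topk(top_k, dim=-1)  # (num_queries, top_k)

    # Attend only over the retrieved keys/values instead of every key.
    weights = torch.softmax(top_scores / (q.shape[-1] ** 0.5), dim=-1)
    retrieved_v = encoder_values[top_idx]             # (num_queries, top_k, d)
    return (weights.unsqueeze(-1) * retrieved_v).sum(dim=1)
```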

Echoes from Alexandria: A Large Resource for Multilingual Book Summarization

babelscape/echoes-from-alexandria 7 Jun 2023

In recent years, research in text summarization has mainly focused on the news domain, where texts are typically short and have strong layout features.

LOCOST: State-Space Models for Long Document Abstractive Summarization

flbbb/locost-summarization 31 Jan 2024

State-space models are a low-complexity alternative to transformers for encoding long sequences and capturing long-term dependencies.
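For intuition, the sketch below runs a minimal diagonal state-space recurrence over a long input in linear time with constant memory per step; the parameter names and toy example are assumptions and do not reproduce the LOCOST architecture.

```python
import numpy as np

def ssm_scan(u, A, B, C):
    """Minimal diagonal state-space recurrence (illustrative, not LOCOST itself).

    x_k = A * x_{k-1} + B * u_k
    y_k = C @ x_k

    u: (L,)  input sequence
    A: (N,)  diagonal state transition (|A| < 1 for stability)
    B: (N,)  input projection
    C: (N,)  output projection
    Runs in O(L * N) time and O(N) memory, versus O(L^2) for self-attention.
    """
    x = np.zeros_like(A)
    y = np.empty(len(u))
    for k, u_k in enumerate(u):
        x = A * x + B * u_k  # linear state update
        y[k] = C @ x         # read out the state
    return y

# Example: encode a 10,000-step sequence with constant memory per step.
L, N = 10_000, 16
rng = np.random.default_rng(0)
y = ssm_scan(rng.standard_normal(L), np.full(N, 0.9), rng.standard_normal(N), rng.standard_normal(N))
```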