Citation Intent Classification

10 papers with code • 2 benchmarks • 4 datasets

Identifying the reason why an author cited another work.

CitePrompt: Using Prompts to Identify Citation Intent in Scientific Papers

avisheklahiri/citeprompt 25 Apr 2023

For the ACL-ARC dataset, we report a 53.86% F1 score for the zero-shot setting, which improves to 63.61% and 66.99% for the 5-shot and 10-shot settings, respectively.

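A minimal sketch of the prompt-based idea behind CitePrompt (not the avisheklahiri/citeprompt implementation): a masked language model scores verbalizer words inserted into a cloze prompt appended to the citation context. The model name, prompt template, and verbalizer words below are illustrative assumptions.

```python
# Illustrative sketch of prompt-based zero-shot citation intent prediction
# (not the CitePrompt code). Model, prompt, and verbalizer are assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Hypothetical verbalizer: one single-token trigger word per intent class.
verbalizer_words = ["background", "method", "result"]

citation_context = "We follow the training procedure of [CITE] to fine-tune our encoder."
prompt = citation_context + " The citation is used for [MASK]."

# Score only the verbalizer words at the masked position.
predictions = fill_mask(prompt, targets=verbalizer_words)
best = max(predictions, key=lambda p: p["score"])
print("predicted intent word:", best["token_str"], "| score:", round(best["score"], 4))
```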

Cross-Lingual Citations in English Papers: A Large-Scale Analysis of Prevalence, Usage, and Impact

illdepence/cross-lingual-citations-from-en 7 Nov 2021

Citation information in scholarly data is an important source of insight into the reception of publications and the scholarly discourse.

ImpactCite: An XLNet-based method for Citation Impact Analysis

DominiqueMercier/ImpactCite 5 May 2020

Therefore, citation impact analysis (which includes sentiment and intent classification) enables us to quantify the quality of citations, which can in turn assist in estimating ranking and impact.

Don't Stop Pretraining: Adapt Language Models to Domains and Tasks

allenai/dont-stop-pretraining ACL 2020

Language models pretrained on text from a wide variety of sources form the foundation of today's NLP.

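A minimal sketch of the continued-pretraining recipe the paper studies (domain-/task-adaptive pretraining: further masked-language-model training on unlabeled in-domain text before task fine-tuning). The toy two-sentence "corpus", hyperparameters, and the roberta-base checkpoint are assumptions, not the allenai/dont-stop-pretraining code.

```python
# Sketch of domain-adaptive pretraining: continue MLM training on in-domain
# text. Corpus, hyperparameters, and checkpoint below are stand-ins.
from datasets import Dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Stand-in for a large unlabeled corpus of in-domain (e.g., scientific) text.
domain_texts = [
    "We fine-tune the encoder on citation contexts from scientific papers.",
    "The baseline is evaluated on the ACL-ARC citation intent dataset.",
]
corpus = Dataset.from_dict({"text": domain_texts}).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
    remove_columns=["text"],
)

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
args = TrainingArguments(output_dir="dapt-checkpoint", num_train_epochs=1,
                         per_device_train_batch_size=2, report_to=[])
Trainer(model=model, args=args, train_dataset=corpus,
        data_collator=collator).train()
```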

Structural Scaffolds for Citation Intent Classification in Scientific Publications

allenai/scicite NAACL 2019

Identifying the intent of a citation in scientific papers (e.g., background information, use of methods, comparing results) is critical for machine reading of individual publications and automated analysis of the scientific literature.

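A minimal fine-tuning sketch for the citation intent labels mentioned above (background, method, result comparison), assuming a toy three-example dataset and a generic bert-base-uncased encoder; the paper's structural scaffold objectives and the full SciCite data are not reproduced here.

```python
# Sketch: fine-tune a sequence classifier on citation sentences.
# The training examples and the encoder choice are assumptions.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

labels = ["background", "method", "result"]
train_data = Dataset.from_dict({
    "text": [
        "Prior work has studied citation networks extensively [CITE].",
        "We adopt the attention mechanism of [CITE] in our encoder.",
        "Our F1 score exceeds the baseline reported in [CITE].",
    ],
    "label": [0, 1, 2],
})

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=len(labels))

train_data = train_data.map(
    lambda ex: tokenizer(ex["text"], truncation=True,
                         padding="max_length", max_length=64))

args = TrainingArguments(output_dir="citation-intent", num_train_epochs=1,
                         per_device_train_batch_size=2, report_to=[])
Trainer(model=model, args=args, train_dataset=train_data).train()

# Classify a new citation sentence with the fine-tuned model.
model = model.cpu().eval()
encoded = tokenizer("We compare our results against [CITE].", return_tensors="pt")
print(labels[int(model(**encoded).logits.argmax(dim=-1))])
```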

SciBERT: A Pretrained Language Model for Scientific Text

allenai/scibert IJCNLP 2019

Obtaining large-scale annotated data for NLP tasks in the scientific domain is challenging and expensive.

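The released SciBERT weights can be loaded from the Hugging Face hub under the model id allenai/scibert_scivocab_uncased; the short sketch below attaches an untrained three-way classification head (an illustrative choice) so it could be swapped in as the encoder in the fine-tuning sketch above.

```python
# Sketch: load the SciBERT checkpoint with a fresh classification head.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "allenai/scibert_scivocab_uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=3)

inputs = tokenizer("We use the dependency parser of [CITE] to preprocess abstracts.",
                   return_tensors="pt")
print(model(**inputs).logits.shape)  # torch.Size([1, 3])
```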

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

huggingface/transformers NAACL 2019

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.

Deep contextualized word representations

flairNLP/flair NAACL 2018

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).

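A small sketch of the core idea (context-dependent word vectors), using the listed flair library's character-level language-model embeddings rather than ELMo itself; the example sentences and the "news-forward" model choice are illustrative.

```python
# Sketch: the same surface word receives different contextual vectors in
# different sentences, which is what lets such embeddings model polysemy.
import torch
from flair.data import Sentence
from flair.embeddings import FlairEmbeddings

embeddings = FlairEmbeddings("news-forward")

s1 = Sentence("The bank approved the loan application.")
s2 = Sentence("They picnicked on the bank of the river.")
embeddings.embed(s1)
embeddings.embed(s2)

def token_vector(sentence, word):
    # Return the contextual embedding of the first token whose text matches `word`.
    for token in sentence:
        if token.text == word:
            return token.embedding
    raise ValueError(f"{word!r} not found in sentence")

similarity = torch.nn.functional.cosine_similarity(
    token_vector(s1, "bank"), token_vector(s2, "bank"), dim=0)
print(f"cosine similarity of the two 'bank' vectors: {similarity.item():.3f}")
```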