CCG Supertagging
8 papers with code • 1 benchmark • 2 datasets
Combinatory Categorial Grammar (CCG; Steedman, 2000) is a highly lexicalized formalism. The standard parsing model of Clark and Curran (2007) uses over 400 lexical categories (or supertags), compared to about 50 part-of-speech tags for typical parsers.
Example:
| Vinken | , | 61 | years | old |
|---|---|---|---|---|
| N | , | N/N | N | (S[adj]\NP)\NP |
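Supertags like `(S[adj]\NP)\NP` are themselves structured objects: a complex category is built from a result and an argument joined by a forward (`/`) or backward (`\`) slash, with parentheses marking grouping. As a minimal illustration (not any particular parser's implementation), the string notation can be decoded with a small recursive-descent parser; the tuple representation `(slash, result, argument)` is my own convention:

```python
def parse_category(s):
    """Parse a CCG category string into a nested tuple.

    Atomic categories (e.g. 'NP', 'S[adj]') stay strings; complex
    categories become (slash, result, argument) tuples. Slashes are
    taken as left-associative, so A\B/C parses as ((A\B)/C).
    """
    cat, rest = _parse(s)
    if rest:
        raise ValueError(f"trailing input: {rest!r}")
    return cat

def _parse(s):
    # One operand, then any number of slash + operand continuations.
    left, rest = _operand(s)
    while rest and rest[0] in "/\\":
        slash = rest[0]
        right, rest = _operand(rest[1:])
        left = (slash, left, right)
    return left, rest

def _operand(s):
    # Either a parenthesized category or an atom like 'N' or 'S[adj]'.
    if s.startswith("("):
        cat, rest = _parse(s[1:])
        if not rest.startswith(")"):
            raise ValueError("unbalanced parentheses")
        return cat, rest[1:]
    i = 0
    while i < len(s) and (s[i].isalnum() or s[i] in "[]"):
        i += 1
    if i == 0:
        raise ValueError(f"expected a category at: {s!r}")
    return s[:i], s[i:]
```

For the example row above, `parse_category("N/N")` gives `('/', 'N', 'N')` and `parse_category("(S[adj]\\NP)\\NP")` gives `('\\', ('\\', 'S[adj]', 'NP'), 'NP')`, making the result/argument structure of each supertag explicit.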
Latest papers with no code
Supertagging with CCG primitives
In this paper, we make use of the primitives and operators that constitute the lexical categories of categorial grammars.
Probing What Different NLP Tasks Teach Machines about Function Word Comprehension
Our results show that pretraining on language modeling performs best on average across our probing tasks, supporting its widespread use for pretraining state-of-the-art NLP models; CCG supertagging and NLI pretraining perform comparably.
An Empirical Investigation of Global and Local Normalization for Recurrent Neural Sequence Models Using a Continuous Relaxation to Beam Search
Globally normalized neural sequence models are considered superior to their locally normalized equivalents because they may ameliorate the effects of label bias.
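The local/global distinction can be made concrete with a tiny brute-force example. Below, a locally normalized model (MEMM-style) applies a softmax at each step conditioned on the previous tag, while a globally normalized model (CRF-style) computes one partition function over all tag sequences; the per-step emission and transition scores are hypothetical, and the enumeration is only feasible for toy sizes:

```python
import itertools
import numpy as np

def sequence_score(emit, trans, seq):
    """Unnormalized score: per-step emissions plus tag-to-tag transitions."""
    s = sum(emit[t, y] for t, y in enumerate(seq))
    s += sum(trans[a, b] for a, b in zip(seq, seq[1:]))
    return s

def global_log_prob(emit, trans, seq):
    # Globally normalized: a single partition function over every
    # possible tag sequence (brute force, for illustration only).
    T, K = emit.shape
    all_scores = [sequence_score(emit, trans, s)
                  for s in itertools.product(range(K), repeat=T)]
    return sequence_score(emit, trans, seq) - np.logaddexp.reduce(all_scores)

def local_log_prob(emit, trans, seq):
    # Locally normalized: a softmax over tags at each step,
    # conditioned on the previously chosen tag.
    logp, prev = 0.0, None
    for t, y in enumerate(seq):
        logits = emit[t] + (trans[prev] if prev is not None else 0.0)
        logp += logits[y] - np.logaddexp.reduce(logits)
        prev = y
    return logp
```

With zero transition scores the two coincide (the global softmax factorizes across steps); once transitions carry weight, the local model's per-step normalization can no longer trade probability mass across future steps, which is the label-bias effect the abstract refers to.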
Language Modeling Teaches You More than Translation Does: Lessons Learned Through Auxiliary Syntactic Task Analysis
Recently, researchers have found that deep LSTMs trained on tasks like machine translation learn substantial syntactic and semantic information about their input sentences, including part-of-speech information.
A Continuous Relaxation of Beam Search for End-to-end Training of Neural Sequence Models
In experiments, we show that optimizing this new training objective yields substantially better results on two sequence tasks (Named Entity Recognition and CCG Supertagging) when compared with both cross-entropy-trained greedy decoding and cross-entropy-trained beam decoding baselines.
Initial Explorations of CCG Supertagging for Universal Dependency Parsing
In this paper we describe the system submitted by the METU team for universal dependency parsing of multilingual text.
An Empirical Exploration of Skip Connections for Sequential Tagging
In this paper, we empirically explore the effects of various kinds of skip connections in stacked bidirectional LSTMs for sequential tagging.
A Dynamic Window Neural Network for CCG Supertagging
These observations motivate us to build a supertagger with a dynamic window approach, which can be treated as an attention mechanism over the local contexts.
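A minimal sketch of what "attention over the local contexts" could look like, assuming dot-product scoring and an attention-weighted average of hidden states within a window (the paper's actual scoring function and window-selection mechanism are not specified here):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def windowed_attention(H, t, w):
    """Context vector for position t from a window of half-width w.

    H is a (seq_len, dim) array of hidden states. Scores are dot
    products against H[t] (an assumption), and the context is the
    attention-weighted average of the windowed states.
    """
    lo, hi = max(0, t - w), min(len(H), t + w + 1)
    window = H[lo:hi]               # only the local context is attended
    scores = window @ H[t]          # dot-product scoring (hypothetical)
    return softmax(scores) @ window
```

With `w` large enough to cover the whole sentence this reduces to ordinary full attention; a small `w` restricts the supertagger to the nearby words, which is the window-based view the abstract describes.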