CCG Supertagging

8 papers with code • 1 benchmark • 2 datasets

Combinatory Categorial Grammar (CCG; Steedman, 2000) is a highly lexicalized formalism. The standard parsing model of Clark and Curran (2007) uses over 400 lexical categories (or supertags), compared with the roughly 50 part-of-speech tags used by typical parsers.

Example:

Vinken   ,   61    years   old
N        ,   N/N   N       (S[adj]\NP)\NP
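
To make the notation concrete, here is a minimal sketch (the function names are illustrative, not from any cited system) that parses a category string such as (S[adj]\NP)\NP into its functor/argument structure, treating slashes as left-associative:

```python
def parse_category(s):
    """Parse a CCG category string into nested (result, slash, argument) tuples."""
    cat, rest = _parse(s)
    assert rest == "", f"trailing input: {rest!r}"
    return cat

def _parse(s):
    # Parse one atom or parenthesized category, then fold in trailing
    # slashes left-associatively: A\B/C is (A\B)/C.
    left, rest = _atom(s)
    while rest[:1] in ("/", "\\"):
        slash = rest[0]
        right, rest = _atom(rest[1:])
        left = (left, slash, right)
    return left, rest

def _atom(s):
    if s.startswith("("):
        cat, rest = _parse(s[1:])
        assert rest.startswith(")"), "unbalanced parentheses"
        return cat, rest[1:]
    i = 0
    while i < len(s) and (s[i].isalnum() or s[i] in "[]"):
        i += 1
    return s[:i], s[i:]

print(parse_category("(S[adj]\\NP)\\NP"))
# (('S[adj]', '\\', 'NP'), '\\', 'NP')
```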

Latest papers with no code

Supertagging with CCG primitives

no code yet • WS 2020

In this paper, we make use of the primitives and operators that constitute the lexical categories of categorial grammars.
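
As a rough illustration of such a decomposition (the regular expression and function name here are assumptions, not the paper's code), a supertag can be split into the atomic categories, slash operators, and brackets it is built from:

```python
import re

def primitives(category):
    """Split a CCG category into atomic categories, slash operators, and brackets."""
    return re.findall(r"[A-Z]+(?:\[[a-z]+\])?|[/\\()]", category)

print(primitives("(S[adj]\\NP)\\NP"))
# ['(', 'S[adj]', '\\', 'NP', ')', '\\', 'NP']
```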

Probing What Different NLP Tasks Teach Machines about Function Word Comprehension

no code yet • SEMEVAL 2019

Our results show that language-modeling pretraining performs best on average across our probing tasks, supporting its widespread use for pretraining state-of-the-art NLP models; CCG supertagging and NLI pretraining perform comparably.

An Empirical Investigation of Global and Local Normalization for Recurrent Neural Sequence Models Using a Continuous Relaxation to Beam Search

no code yet • NAACL 2019

Globally normalized neural sequence models are considered superior to their locally normalized equivalents because they may ameliorate the effects of label bias.
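
The contrast can be sketched as follows (a brute-force toy illustration, not the paper's relaxation): a locally normalized model applies a softmax at every step, while a globally normalized model normalizes a sequence's total score against all possible label sequences:

```python
import itertools
import torch

def local_log_prob(scores, labels):
    # scores: (T, L) per-step logits; labels: list of T label indices.
    log_probs = torch.log_softmax(scores, dim=-1)
    return sum(log_probs[t, y] for t, y in enumerate(labels))

def global_log_prob(scores, labels):
    # Unnormalized sequence score minus the log-partition over all label
    # sequences (enumeration is exponential; real models use dynamic
    # programming or beam approximations instead).
    score = sum(scores[t, y] for t, y in enumerate(labels))
    T, L = scores.shape
    all_scores = torch.stack([
        sum(scores[t, y] for t, y in enumerate(seq))
        for seq in itertools.product(range(L), repeat=T)
    ])
    return score - torch.logsumexp(all_scores, dim=0)

scores = torch.randn(3, 4)
print(local_log_prob(scores, [0, 2, 1]), global_log_prob(scores, [0, 2, 1]))
```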

Language Modeling Teaches You More than Translation Does: Lessons Learned Through Auxiliary Syntactic Task Analysis

no code yet • WS 2018

Recently, researchers have found that deep LSTMs trained on tasks like machine translation learn substantial syntactic and semantic information about their input sentences, including part-of-speech information.

A Continuous Relaxation of Beam Search for End-to-end Training of Neural Sequence Models

no code yet • 1 Aug 2017

In experiments, we show that optimizing this new training objective yields substantially better results on two sequence tasks (Named Entity Recognition and CCG Supertagging) than both cross-entropy-trained greedy decoding and cross-entropy-trained beam decoding baselines.
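
The core trick, loosely sketched here (the full method also relaxes top-k selection and backpointers, which this omits), is to replace the non-differentiable argmax in decoding with a peaked softmax so gradients flow through candidate selection:

```python
import torch

def soft_argmax_embedding(scores, embeddings, alpha=10.0):
    # scores: (V,) candidate scores; embeddings: (V, D) candidate vectors.
    # Hard decoding would take embeddings[scores.argmax()]; instead return
    # a differentiable convex combination that approaches it as the
    # temperature alpha grows.
    weights = torch.softmax(alpha * scores, dim=-1)   # (V,)
    return weights @ embeddings                       # (D,)
```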

Initial Explorations of CCG Supertagging for Universal Dependency Parsing

no code yet • CONLL 2017

In this paper we describe the METU team's system for universal dependency parsing of multilingual text.

An Empirical Exploration of Skip Connections for Sequential Tagging

no code yet • COLING 2016

In this paper, we empirically explore the effects of various kinds of skip connections in stacked bidirectional LSTMs for sequential tagging.
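
One common variant, sketched here under assumed dimensions (the module is illustrative, not the paper's exact architecture), adds each layer's input to its output:

```python
import torch.nn as nn

class SkipBiLSTMTagger(nn.Module):
    def __init__(self, dim, num_layers, num_tags):
        super().__init__()
        # dim // 2 hidden units per direction, so each layer's output has
        # the same width as its input and the two can be summed.
        self.layers = nn.ModuleList([
            nn.LSTM(dim, dim // 2, bidirectional=True, batch_first=True)
            for _ in range(num_layers)
        ])
        self.out = nn.Linear(dim, num_tags)

    def forward(self, x):                 # x: (batch, seq, dim)
        for lstm in self.layers:
            h, _ = lstm(x)
            x = x + h                     # skip connection across the layer
        return self.out(x)                # (batch, seq, num_tags)
```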

A Dynamic Window Neural Network for CCG Supertagging

no code yet • 10 Oct 2016

These observations motivate us to build a supertagger with a dynamic window approach, which can be treated as an attention mechanism over local contexts.
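
A minimal sketch of the idea (all names and details here are assumptions, not the paper's model): score the positions in a local window around the target word and take an attention-weighted sum, so the effective window adapts per word:

```python
import torch

def local_window_attention(embs, center, radius, w):
    # embs: (T, D) word embeddings; center: index of the target word;
    # radius: maximum half-width of the window; w: (D,) learned scorer.
    lo, hi = max(0, center - radius), min(embs.size(0), center + radius + 1)
    window = embs[lo:hi]                     # (W, D) local context
    attn = torch.softmax(window @ w, dim=0)  # learned weights over the window
    return attn @ window                     # (D,) context representation
```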