Search Results for author: Philip Schulz

Found 9 papers, 4 papers with code

Unsupervised Cross-Lingual Transfer of Structured Predictors without Source Data

1 code implementation • NAACL 2022 • Kemal Kurniawan, Lea Frermann, Philip Schulz, Trevor Cohn

Providing technologies to communities or domains where training data is scarce or protected, e.g., for privacy reasons, is becoming increasingly important.

Cross-Lingual Transfer • Dependency Parsing +1

PTST-UoM at SemEval-2021 Task 10: Parsimonious Transfer for Sequence Tagging

no code implementations • SemEval 2021 • Kemal Kurniawan, Lea Frermann, Philip Schulz, Trevor Cohn

This paper describes PTST, a source-free unsupervised domain adaptation technique for sequence tagging, and its application to the SemEval-2021 Task 10 on time expression recognition.

Unsupervised Domain Adaptation

Causal Bias Quantification for Continuous Treatments

no code implementations • 17 Jun 2021 • Gianluca Detommaso, Michael Brückner, Philip Schulz, Victor Chernozhukov

We extend the definition of the marginal causal effect to the continuous treatment setting and develop a novel characterization of causal bias in the framework of structural causal models.

Selection bias

PPT: Parsimonious Parser Transfer for Unsupervised Cross-Lingual Adaptation

1 code implementation • EACL 2021 • Kemal Kurniawan, Lea Frermann, Philip Schulz, Trevor Cohn

Cross-lingual transfer is a leading technique for parsing low-resource languages in the absence of explicit supervision.

Cross-Lingual Transfer

Variational Inference and Deep Generative Models

no code implementations • ACL 2018 • Wilker Aziz, Philip Schulz

Using DGMs, one can easily design latent variable models that account for missing observations and thereby enable unsupervised and semi-supervised learning with neural networks.

Machine Translation • Natural Language Inference +1

A Stochastic Decoder for Neural Machine Translation

1 code implementation • ACL 2018 • Philip Schulz, Wilker Aziz, Trevor Cohn

The process of translation is ambiguous, in that there are typically many valid translations for a given sentence.

Machine Translation • Sentence +3

Fast Collocation-Based Bayesian HMM Word Alignment

no code implementations • COLING 2016 • Philip Schulz, Wilker Aziz

In order to make our model useful in practice, we devise an auxiliary variable Gibbs sampler that allows us to resample alignment links in constant time independently of the target sentence length.

Language Modelling • Machine Translation +3
