Search Results for author: Yijun Xiao

Found 10 papers, 0 papers with code

Directional FDR Control for Sub-Gaussian Sparse GLMs

no code implementations 2 May 2021 Chang Cui, Jinzhu Jia, Yijun Xiao, Huiming Zhang

Using the debiased estimator, we establish multiple testing procedures.

On Hallucination and Predictive Uncertainty in Conditional Language Generation

no code implementations EACL 2021 Yijun Xiao, William Yang Wang

Despite improvements in performance on different natural language generation tasks, deep neural models are prone to hallucinating facts that are incorrect or nonexistent.

Tasks: Data-to-Text Generation, Hallucination, +1

Why Neural Machine Translation Prefers Empty Outputs

no code implementations 24 Dec 2020 Xing Shi, Yijun Xiao, Kevin Knight

Using different EoS types in target sentences of different lengths exposes and eliminates this implicit smoothing.

Tasks: Machine Translation, NMT, +1

Disentangled Representation Learning with Wasserstein Total Correlation

no code implementations 30 Dec 2019 Yijun Xiao, William Yang Wang

However, Kullback-Leibler (KL) divergence-based total correlation is metric-agnostic and sensitive to data samples.

Tasks: Disentanglement

Text Modeling with Syntax-Aware Variational Autoencoders

no code implementations 27 Aug 2019 Yijun Xiao, William Yang Wang

We propose syntax-aware variational autoencoders (SAVAEs) that dedicate a subspace in the latent dimensions dubbed syntactic latent to represent syntactic structures of sentences.

Tasks: Representation Learning

Quantifying Uncertainties in Natural Language Processing Tasks

no code implementations 18 Nov 2018 Yijun Xiao, William Yang Wang

Reliable uncertainty quantification is a first step towards building explainable, transparent, and accountable artificial intelligence systems.

Tasks: Language Modelling, named-entity-recognition, +4

Dirichlet Variational Autoencoder for Text Modeling

no code implementations 31 Oct 2018 Yijun Xiao, Tiancheng Zhao, William Yang Wang

We introduce an improved variational autoencoder (VAE) for text modeling with topic information explicitly modeled as a Dirichlet latent variable.

Tree-Structured Neural Machine for Linguistics-Aware Sentence Generation

no code implementations 30 Apr 2017 Ganbin Zhou, Ping Luo, Rongyu Cao, Yijun Xiao, Fen Lin, Bo Chen, Qing He

Then, with a proposed tree-structured search method, the model is able to generate the most probable responses in the form of dependency trees, which are finally flattened into sequences as the system output.

Tasks: Sentence
