Search Results for author: Xianghong Fang

Found 5 papers, 2 papers with code

Controlled Text Generation Using Dictionary Prior in Variational Autoencoders

no code implementations • Findings (ACL) 2022 • Xianghong Fang, Jian Li, Lifeng Shang, Xin Jiang, Qun Liu, Dit-yan Yeung

While variational autoencoders (VAEs) have been widely applied in text generation tasks, they are troubled by two challenges: insufficient representation capacity and poor controllability.

Contrastive Learning • Language Modelling • +2

Rethinking The Uniformity Metric in Self-Supervised Learning

1 code implementation • 1 Mar 2024 • Xianghong Fang, Jian Li, Qiang Sun, Benyou Wang

Uniformity plays a crucial role in the assessment of learned representations, contributing to a deeper comprehension of self-supervised learning.

Self-Supervised Learning
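
For context on the metric this paper re-examines, below is a minimal sketch (assuming PyTorch) of the widely used uniformity measure of Wang & Isola (2020), i.e. the log of the average pairwise Gaussian potential on the unit hypersphere. This is the standard baseline formulation, not necessarily the alternative metric the paper itself proposes.

```python
# Minimal sketch: the standard uniformity metric (Wang & Isola, 2020).
# More negative values indicate embeddings spread more uniformly over the
# unit hypersphere. Not the paper's proposed replacement metric.
import torch
import torch.nn.functional as F

def uniformity(embeddings: torch.Tensor, t: float = 2.0) -> torch.Tensor:
    """Log of the mean pairwise Gaussian potential on the unit hypersphere."""
    z = F.normalize(embeddings, dim=-1)          # project onto the sphere
    sq_dists = torch.pdist(z, p=2).pow(2)        # squared pairwise distances
    return sq_dists.mul(-t).exp().mean().log()   # log-mean-exp of -t * d^2

# Usage: for high-dimensional random directions the value approaches -2t,
# since squared distances concentrate near 2.
print(uniformity(torch.randn(512, 128)))
```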

Discrete Auto-regressive Variational Attention Models for Text Modeling

1 code implementation • 16 Jun 2021 • Xianghong Fang, Haoli Bai, Jian Li, Zenglin Xu, Michael Lyu, Irwin King

We further design a discrete latent space for the variational attention and mathematically show that our model is free from posterior collapse.

Language Modelling

Discrete Variational Attention Models for Language Generation

no code implementations • 21 Apr 2020 • Xianghong Fang, Haoli Bai, Zenglin Xu, Michael Lyu, Irwin King

Variational autoencoders have been widely applied to natural language generation; however, they suffer from two long-standing problems: information under-representation and posterior collapse.

Language Modelling • Text Generation
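
As background on the posterior-collapse problem named in the abstract above, here is a minimal sketch (assuming PyTorch and a standard Gaussian-prior VAE, not the paper's discrete model) of the KL term whose vanishing is the usual symptom of collapse.

```python
# Minimal sketch: diagnosing posterior collapse in a Gaussian VAE.
# When KL(q(z|x) || N(0, I)) stays near zero during training, the decoder
# is effectively ignoring the latent code. Illustrates the general
# phenomenon only; the paper's own model is discrete and not shown here.
import torch

def gaussian_kl(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL divergence between N(mu, diag(exp(logvar))) and the standard normal."""
    return 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(dim=-1)

# Hypothetical encoder outputs for a batch of 32 sentences, 16 latent dims:
mu, logvar = torch.zeros(32, 16), torch.zeros(32, 16)
print(gaussian_kl(mu, logvar).mean())   # 0.0 -> a fully collapsed posterior
```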

DART: Domain-Adversarial Residual-Transfer Networks for Unsupervised Cross-Domain Image Classification

no code implementations • 30 Dec 2018 • Xianghong Fang, Haoli Bai, Ziyi Guo, Bin Shen, Steven Hoi, Zenglin Xu

In this paper, we propose a new unsupervised domain adaptation method named Domain-Adversarial Residual-Transfer (DART) learning of Deep Neural Networks to tackle cross-domain image classification tasks.

Classification • General Classification • +2
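
To illustrate the "domain-adversarial" ingredient named in the title, below is a minimal sketch (assuming PyTorch) of the gradient reversal layer from Ganin & Lempitsky (2015) that domain-adversarial adaptation methods commonly build on. DART's full architecture, including its residual-transfer components, is not reproduced here, and it may realize adversarial training differently.

```python
# Minimal sketch: gradient reversal layer (Ganin & Lempitsky, 2015), the
# standard building block for adversarial domain adaptation. Not DART itself.
import torch

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; scales gradients by -lam in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam: float):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def grad_reverse(x: torch.Tensor, lam: float = 1.0) -> torch.Tensor:
    return GradReverse.apply(x, lam)

# Feeding features through grad_reverse before a domain classifier pushes the
# feature extractor toward domain-invariant representations, while the domain
# classifier itself still learns to distinguish source from target.
```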
