1 code implementation • 22 Feb 2020 • Bin Wang, Fenxiao Chen, Yuncheng Wang, C. -C. Jay Kuo
Given that word embeddings capture semantic relationships and semantically similar words tend to form groups in a high-dimensional embedding space, we develop a sentence representation scheme by analyzing the semantic subspaces of a sentence's constituent words.
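As a rough illustration of the idea, the sketch below pools a sentence's word vectors by semantic group. It uses a simple k-means partition as a stand-in for the paper's semantic-subspace analysis; the function name, the grouping method, and the mean-pooling step are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def sentence_embedding(word_vectors, n_groups=2, seed=0):
    """Illustrative sketch: pool a sentence's word vectors by semantic group.

    Words are partitioned into groups with a few k-means steps (a stand-in
    for the paper's semantic-subspace analysis), then each group's mean
    vector is concatenated to form a fixed-size sentence representation.
    """
    X = np.asarray(word_vectors, dtype=float)
    rng = np.random.default_rng(seed)
    # Initialise centroids from randomly chosen words.
    centroids = X[rng.choice(len(X), size=min(n_groups, len(X)), replace=False)]
    for _ in range(10):
        # Assign each word to its nearest centroid, then update centroids.
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for k in range(len(centroids)):
            if np.any(labels == k):
                centroids[k] = X[labels == k].mean(axis=0)
    # Concatenate group means -> one vector per sentence.
    return centroids.reshape(-1)
```

The key property this preserves is that semantically close words contribute to the same pooled component, rather than being averaged with unrelated ones.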
1 code implementation • 3 Sep 2019 • Fenxiao Chen, Yuncheng Wang, Bin Wang, C. -C. Jay Kuo
Research on graph representation learning has received a lot of attention in recent years, since much of the data in real-world applications comes in the form of graphs.
no code implementations • 28 Jan 2019 • Bin Wang, Angela Wang, Fenxiao Chen, Yuncheng Wang, C. -C. Jay Kuo
This work conducts an extensive evaluation of a large number of word embedding models across language processing applications.
no code implementations • 4 Sep 2018 • Fenxiao Chen, Bin Wang, C. -C. Jay Kuo
A novel graph-to-tree conversion mechanism called the deep-tree generation (DTG) algorithm is first proposed to predict text data represented by graphs.
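For intuition about what a graph-to-tree conversion does, here is a minimal sketch, assuming a plain breadth-first spanning tree; this is not the paper's DTG algorithm, which generates deeper trees, but it shows the basic shape of the transformation.

```python
from collections import deque

def graph_to_tree(adj, root):
    """Illustrative graph-to-tree conversion (NOT the paper's DTG):
    a breadth-first spanning tree rooted at `root`, returned as a
    node -> children mapping. Each node is attached to the first
    neighbor that reaches it, so cycles in the graph are broken.
    """
    tree = {node: [] for node in adj}
    visited = {root}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in visited:
                visited.add(nbr)
                tree[node].append(nbr)
                queue.append(nbr)
    return tree
```

A tree produced this way can then be consumed by tree-structured models (e.g. tree-LSTMs) that cannot operate on cyclic graphs directly.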
1 code implementation • 20 Aug 2018 • Bin Wang, Fenxiao Chen, Angela Wang, C. -C. Jay Kuo
Although embedded vector representations of words offer impressive performance on many natural language processing (NLP) applications, information about the order of input sequences is lost to some extent if only context-based samples are used in training.
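The order-loss point can be made concrete with a small sketch. Assuming a CBOW-style training scheme where each sample is a (center word, unordered context bag) pair, two sentences with the same words in different orders can yield identical training samples; the helper below is an illustrative construction, not code from the paper.

```python
from collections import Counter

def context_samples(tokens, window=2):
    """Illustrative CBOW-style (center, context-bag) samples.

    Because the context is an unordered bag, word order inside the
    window does not reach the training signal.
    """
    samples = []
    for i, center in enumerate(tokens):
        ctx = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        # frozenset of (word, count) pairs: a hashable unordered bag.
        samples.append((center, frozenset(Counter(ctx).items())))
    return Counter(samples)
```

For example, `"dog bites man"` and `"man bites dog"` produce the same sample multiset under this scheme, even though they mean different things.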