Search Results for author: Yuming Liu

Found 7 papers, 4 papers with code

EDTER: Edge Detection with Transformer

1 code implementation • CVPR 2022 • Mengyang Pu, Yaping Huang, Yuming Liu, Qingji Guan, Haibin Ling

In Stage I, a global transformer encoder is used to capture long-range global context on coarse-grained image patches.

Edge Detection
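
As a rough illustration of what Stage I's "global transformer encoder over coarse-grained image patches" could look like, here is a minimal PyTorch sketch. It is not the authors' EDTER implementation; the patch size, embedding dimension, depth, and head count are illustrative assumptions.

```python
# Minimal sketch (not the EDTER code): a global transformer encoder over
# coarse-grained image patches. All hyperparameters below are assumptions.
import torch
import torch.nn as nn

class GlobalPatchEncoder(nn.Module):
    def __init__(self, img_size=320, patch_size=16, embed_dim=256, depth=4, heads=8):
        super().__init__()
        # Split the image into coarse patches and project each patch to an embedding.
        self.patch_embed = nn.Conv2d(3, embed_dim, kernel_size=patch_size, stride=patch_size)
        num_patches = (img_size // patch_size) ** 2
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches, embed_dim))
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):                             # x: (B, 3, H, W)
        tokens = self.patch_embed(x)                  # (B, C, H/ps, W/ps)
        tokens = tokens.flatten(2).transpose(1, 2)    # (B, N, C) patch tokens
        # Self-attention over all patch tokens captures long-range global context.
        return self.encoder(tokens + self.pos_embed)

# Usage: features = GlobalPatchEncoder()(torch.randn(1, 3, 320, 320))
```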

HousE: Knowledge Graph Embedding with Householder Parameterization

1 code implementation • 16 Feb 2022 • Rui Li, Jianan Zhao, Chaozhuo Li, Di He, Yiqi Wang, Yuming Liu, Hao Sun, Senzhang Wang, Weiwei Deng, Yanming Shen, Xing Xie, Qi Zhang

The effectiveness of knowledge graph embedding (KGE) largely depends on the ability to model intrinsic relation patterns and mapping properties.

Knowledge Graph Embedding • Relation • +1
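
The Householder parameterization named in the title presumably builds relation transformations from Householder reflections H = I - 2vv^T/(v^T v), which are orthogonal by construction. The sketch below shows that construction in isolation; it is not the paper's implementation, and the embedding dimension and number of composed reflections are arbitrary.

```python
# Minimal sketch (assumptions, not the HousE code): parameterize a relation as a
# product of Householder reflections, so the matrix is exactly orthogonal.
import torch

def householder_matrix(v):
    """Reflection H = I - 2 v v^T / (v^T v); always orthogonal."""
    v = v / v.norm()
    return torch.eye(v.numel()) - 2.0 * torch.outer(v, v)

def relation_matrix(vectors):
    """Compose several reflections into one orthogonal relation matrix."""
    R = torch.eye(vectors.shape[1])
    for v in vectors:
        R = householder_matrix(v) @ R
    return R

# Usage: two learnable reflection vectors in a 4-dimensional embedding space.
params = torch.randn(2, 4, requires_grad=True)
R = relation_matrix(params)
print(torch.allclose(R @ R.T, torch.eye(4), atol=1e-5))  # True: R is orthogonal
```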

Gophormer: Ego-Graph Transformer for Node Classification

no code implementations • 25 Oct 2021 • Jianan Zhao, Chaozhuo Li, Qianlong Wen, Yiqi Wang, Yuming Liu, Hao Sun, Xing Xie, Yanfang Ye

Existing graph transformer models typically adopt a fully-connected attention mechanism over the whole input graph and thus suffer from severe scalability issues and are intractable to train when data is insufficient.

Classification • Data Augmentation • +3
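
To make the scalability point concrete, here is a minimal sketch of the kind of ego-graph sampling the title suggests: attention would run only over a target node plus a bounded sample of its neighbors rather than over the whole graph. The sampler, its parameters, and the adjacency format are assumptions, not the paper's method.

```python
# Minimal sketch (an assumption, not Gophormer's sampler): restrict the token set
# for attention to a sampled ego-graph, bounding cost by the sample size
# rather than by the number of nodes in the graph.
import random

def sample_ego_graph(adj, node, num_neighbors=8, seed=None):
    """adj: dict mapping node -> list of neighbors. Returns node + sampled neighbors."""
    rng = random.Random(seed)
    neighbors = adj.get(node, [])
    if len(neighbors) > num_neighbors:
        neighbors = rng.sample(neighbors, num_neighbors)
    return [node] + list(neighbors)

# Usage: attention for node 0 then runs only over this small token set.
adj = {0: [1, 2, 3, 4, 5], 1: [0, 2], 2: [0, 1]}
print(sample_ego_graph(adj, 0, num_neighbors=3, seed=42))
```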

AdsGNN: Behavior-Graph Augmented Relevance Modeling in Sponsored Search

1 code implementation • 25 Apr 2021 • Chaozhuo Li, Bochen Pang, Yuming Liu, Hao Sun, Zheng Liu, Xing Xie, Tianqi Yang, Yanling Cui, Liangjie Zhang, Qi Zhang

Our motivation lies in incorporating the tremendous amount of unsupervised user behavior data from the historical search logs as the complementary graph to facilitate relevance modeling.

Marketing
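
As a rough sketch of how historical search-log behavior could be turned into the complementary graph the abstract mentions, the snippet below builds a bipartite query-ad graph from click events. The log layout and helper names are assumptions, not the AdsGNN pipeline.

```python
# Minimal sketch (assumed data layout, not the paper's pipeline): build a bipartite
# query-ad behavior graph from click events; its neighborhoods can then complement
# the raw query-ad text pair during relevance modeling.
from collections import defaultdict

def build_behavior_graph(click_log):
    """click_log: iterable of (query, ad_id) click events."""
    query_to_ads = defaultdict(set)
    ad_to_queries = defaultdict(set)
    for query, ad_id in click_log:
        query_to_ads[query].add(ad_id)
        ad_to_queries[ad_id].add(query)
    return query_to_ads, ad_to_queries

# Usage: a query's neighbors are the ads users clicked for it, and vice versa.
q2a, a2q = build_behavior_graph([("running shoes", "ad_1"), ("running shoes", "ad_2"),
                                 ("sneakers", "ad_1")])
print(q2a["running shoes"], a2q["ad_1"])
```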

TextGNN: Improving Text Encoder via Graph Neural Network in Sponsored Search

2 code implementations • 15 Jan 2021 • Jason Yue Zhu, Yanling Cui, Yuming Liu, Hao Sun, Xue Li, Markus Pelger, Tianqi Yang, Liangjie Zhang, Ruofei Zhang, Huasha Zhao

Text encoders based on C-DSSM or transformers have demonstrated strong performance in many Natural Language Processing (NLP) tasks.

Natural Language Understanding
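
As a loose illustration of pairing a text encoder with graph-derived neighbor information, as the title suggests, the sketch below mean-pools neighbor embeddings and fuses them with a stand-in text representation. Every module size and the fusion choice are assumptions, not the TextGNN architecture.

```python
# Minimal sketch (illustrative assumptions only): fuse a simple text representation
# with the mean of graph-neighbor embeddings before scoring relevance.
import torch
import torch.nn as nn

class TextWithNeighbors(nn.Module):
    def __init__(self, vocab_size=10000, dim=128):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)   # stand-in for a text encoder
        self.combine = nn.Linear(2 * dim, dim)

    def forward(self, token_ids, neighbor_vecs):
        text_vec = self.embed(token_ids)                # (B, dim) bag-of-tokens encoding
        neigh_vec = neighbor_vecs.mean(dim=1)           # (B, dim) mean over neighbor embeddings
        return self.combine(torch.cat([text_vec, neigh_vec], dim=-1))

# Usage: relevance could be a dot product between query-side and ad-side outputs.
model = TextWithNeighbors()
out = model(torch.randint(0, 10000, (2, 6)), torch.randn(2, 4, 128))
```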
