Search Results for author: Qingyang Zhu

Found 3 papers, 2 papers with code

Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale

2 code implementations · 13 Mar 2024 · Xiang Hu, Pengyu Ji, Qingyang Zhu, Wei Wu, Kewei Tu

A syntactic language model (SLM) incrementally generates a sentence with its syntactic tree in a left-to-right manner.

Language Modelling · Sentence · +1

Augmenting Transformers with Recursively Composed Multi-grained Representations

1 code implementation · 28 Sep 2023 · Xiang Hu, Qingyang Zhu, Kewei Tu, Wei Wu

More interestingly, the hierarchical structures induced by ReCAT exhibit strong consistency with human-annotated syntactic trees, indicating the good interpretability afforded by the CIO layers.

Constituency Grammar Induction · Natural Language Inference · +1

Domain Generalization Deep Graph Transformation

no code implementations · 19 May 2023 · Shiyu Wang, Guangji Bai, Qingyang Zhu, Zhaohui Qin, Liang Zhao

As a result, domain generalization graph transformation, which predicts graphs not available in the training data, is under-explored, with multiple key challenges to be addressed, including (1) the extreme space complexity of training on all input-output mode combinations, (2) the difference in graph topology between the input and the output modes, and (3) how to generalize the model to (unseen) target domains that are not in the training data.

Domain Generalization · Link Prediction
