Search Results for author: Zhao Meng

Found 14 papers, 8 papers with code

TempCaps: A Capsule Network-based Embedding Model for Temporal Knowledge Graph Completion

1 code implementation · spnlp (ACL) 2022 · Guirong Fu, Zhao Meng, Zhen Han, Zifeng Ding, Yunpu Ma, Matthias Schubert, Volker Tresp, Roger Wattenhofer

In this paper, we tackle temporal knowledge graph completion by proposing TempCaps, a capsule network-based embedding model for the task.

Entity Embeddings · Temporal Knowledge Graph Completion
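
TempCaps aggregates the embeddings of dynamically retrieved neighbors through capsule routing. As a rough, hedged illustration of that aggregation step, the sketch below implements standard routing-by-agreement (Sabour et al., 2017) in PyTorch; the shapes and the squash nonlinearity follow the original capsule formulation, not necessarily TempCaps' exact variant.

```python
import torch

def squash(s, eps=1e-8):
    # Capsule nonlinearity: shrinks short vectors toward zero,
    # keeps long vectors just below unit length.
    norm2 = (s * s).sum()
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: (num_neighbors, dim) prediction vectors from retrieved neighbors.
    b = torch.zeros(u_hat.size(0))            # routing logits, one per neighbor
    for _ in range(num_iters):
        c = torch.softmax(b, dim=0)           # coupling coefficients
        s = (c.unsqueeze(-1) * u_hat).sum(0)  # weighted sum of predictions
        v = squash(s)                         # aggregated capsule output
        b = b + u_hat @ v                     # reward neighbors that agree with v
    return v
```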

Beyond Prompting: Making Pre-trained Language Models Better Zero-shot Learners by Clustering Representations

1 code implementation · 29 Oct 2022 · Yu Fei, Ping Nie, Zhao Meng, Roger Wattenhofer, Mrinmaya Sachan

We further explore the applicability of our clustering approach by evaluating it on 14 datasets with more diverse topics, text lengths, and numbers of classes.

Clustering · Sentence · +7
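
As a hedged sketch of the cluster-then-label idea behind this paper (not the authors' exact pipeline): embed unlabeled texts with any pre-trained sentence encoder, cluster the embeddings, and assign each cluster the label whose description embedding is closest. Here `text_embs` and `label_embs` are assumed to be precomputed.

```python
import numpy as np
from sklearn.cluster import KMeans

def zero_shot_by_clustering(text_embs, label_embs):
    # text_embs: (n_texts, dim); label_embs: (n_labels, dim)
    km = KMeans(n_clusters=len(label_embs), n_init=10).fit(text_embs)
    centers = km.cluster_centers_                        # (n_labels, dim)
    # Cosine similarity between cluster centers and label descriptions.
    sim = (centers @ label_embs.T) / (
        np.linalg.norm(centers, axis=1, keepdims=True)
        * np.linalg.norm(label_embs, axis=1)
    )
    cluster_to_label = sim.argmax(axis=1)                # map each cluster to a label
    return cluster_to_label[km.labels_]                  # per-text predicted labels
```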

BERT is Robust! A Case Against Synonym-Based Adversarial Examples in Text Classification

no code implementations · 15 Sep 2021 · Jens Hauser, Zhao Meng, Damián Pascual, Roger Wattenhofer

We combine a human evaluation of individual word substitutions and a probabilistic analysis to show that between 96% and 99% of the analyzed attacks do not preserve semantics, indicating that their success is mainly based on feeding poor data to the model.

Data Augmentation · text-classification · +1

Self-Supervised Contrastive Learning with Adversarial Perturbations for Defending Word Substitution-based Attacks

1 code implementation · Findings (NAACL) 2022 · Zhao Meng, Yihan Dong, Mrinmaya Sachan, Roger Wattenhofer

In this paper, we present an approach to improve the robustness of BERT language models against word substitution-based adversarial attacks by leveraging adversarial perturbations for self-supervised contrastive learning.

Adversarial Attack · Contrastive Learning · +1
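
A minimal sketch of the core objective, assuming the standard NT-Xent contrastive loss with adversarially perturbed inputs as the positive views; the paper's exact loss and perturbation procedure may differ.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(clean, perturbed, temperature=0.1):
    # NT-Xent-style objective: each clean representation should match its own
    # adversarially perturbed view (the diagonal) and repel the rest of the
    # batch. clean, perturbed: (batch, dim) sentence representations; the
    # perturbed view would come from, e.g., a gradient-based perturbation
    # of the input word embeddings.
    z1 = F.normalize(clean, dim=1)
    z2 = F.normalize(perturbed, dim=1)
    logits = z1 @ z2.t() / temperature       # (batch, batch) cosine similarities
    targets = torch.arange(z1.size(0))       # positives sit on the diagonal
    return F.cross_entropy(logits, targets)
```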

KM-BART: Knowledge Enhanced Multimodal BART for Visual Commonsense Generation

1 code implementation · ACL 2021 · Yiran Xing, Zai Shi, Zhao Meng, Gerhard Lakemeyer, Yunpu Ma, Roger Wattenhofer

We present Knowledge Enhanced Multimodal BART (KM-BART), a Transformer-based sequence-to-sequence model capable of reasoning about commonsense knowledge from multimodal inputs of images and texts.

Knowledge Graphs · Language Modelling · +1
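
KM-BART extends BART with visual features and knowledge-enhanced pretraining tasks. Purely as a hedged illustration of the text-only backbone, here is BART doing conditional generation via Hugging Face Transformers; the multimodal fusion and knowledge components are not reproduced here.

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Public text-only checkpoint; KM-BART itself adds visual inputs on top.
tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

inputs = tokenizer("A person jumps over a puddle because", return_tensors="pt")
out = model.generate(**inputs, max_length=30, num_beams=4)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```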

A Geometry-Inspired Attack for Generating Natural Language Adversarial Examples

no code implementations · COLING 2020 · Zhao Meng, Roger Wattenhofer

Generating adversarial examples for natural language is hard, as natural language consists of discrete symbols and examples are often of variable length.
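
The general recipe behind such embedding-space attacks can be sketched as a gradient step followed by projection back onto a real word; this is a hedged approximation of the idea, not the paper's exact geometric construction, and the step size below is arbitrary.

```python
import torch

def geometry_attack_step(emb, grad, vocab_embs):
    # emb: (dim,) current word embedding; grad: (dim,) d(loss)/d(emb);
    # vocab_embs: (vocab, dim) embedding matrix of all words.
    step = emb + 0.5 * grad / (grad.norm() + 1e-8)   # move toward the decision boundary
    dists = (vocab_embs - step).norm(dim=1)          # distance to every real word
    return dists.argmin()                            # index of the nearest word
```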

How Transferable are Neural Networks in NLP Applications?

no code implementations · EMNLP 2016 · Lili Mou, Zhao Meng, Rui Yan, Ge Li, Yan Xu, Lu Zhang, Zhi Jin

Transfer learning aims to leverage valuable knowledge from a source domain to improve model performance in a target domain.

Transfer Learning
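
A minimal sketch of the parameter-transfer setting this line of work studies: initialize from a model trained on a source task, freeze the lower layers, and fine-tune the rest on the target task. The paper predates BERT, so the checkpoint and layer split below are purely illustrative.

```python
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
# Transfer "frozen" parameters: keep the embeddings and the first 8 encoder
# layers fixed; the remaining layers and the new classifier head stay trainable.
for p in model.bert.embeddings.parameters():
    p.requires_grad = False
for layer in model.bert.encoder.layer[:8]:
    for p in layer.parameters():
        p.requires_grad = False
```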
