Search Results for author: Zexuan Zhong

Found 15 papers, 11 papers with code

Reliable, Adaptable, and Attributable Language Models with Retrieval

no code implementations • 5 Mar 2024 • Akari Asai, Zexuan Zhong, Danqi Chen, Pang Wei Koh, Luke Zettlemoyer, Hannaneh Hajishirzi, Wen-tau Yih

Parametric language models (LMs), which are trained on vast amounts of web data, exhibit remarkable flexibility and capability.

Question Answering • Retrieval

REST: Retrieval-Based Speculative Decoding

1 code implementation • 14 Nov 2023 • Zhenyu He, Zexuan Zhong, Tianle Cai, Jason D. Lee, Di He

We introduce Retrieval-Based Speculative Decoding (REST), a novel algorithm designed to speed up language model generation.
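To make the idea concrete, here is a minimal sketch of retrieval-based drafting, not the authors' implementation: a toy exact-match datastore proposes continuation tokens, which are then checked against the target model. `lm_next_token` (a greedy next-token oracle) and all other names are hypothetical stand-ins, and REST itself verifies drafts in a single batched forward pass rather than token by token.

```python
# Hypothetical sketch of retrieval-based speculative decoding (not REST's code).

def build_datastore(corpus_tokens, context_len=4):
    """Map each length-`context_len` context in the corpus to a continuation."""
    store = {}
    for i in range(len(corpus_tokens) - context_len):
        key = tuple(corpus_tokens[i:i + context_len])
        store.setdefault(key, corpus_tokens[i + context_len:i + context_len + 8])
    return store

def generate(prompt, lm_next_token, store, max_new=32, context_len=4):
    tokens = list(prompt)
    while len(tokens) - len(prompt) < max_new:
        # Retrieve a draft continuation for the current context suffix.
        draft = store.get(tuple(tokens[-context_len:]), [])
        accepted = 0
        for d in draft:
            # REST verifies the whole draft in one forward pass; we check
            # token by token here only to keep the sketch simple.
            if lm_next_token(tokens) == d:
                tokens.append(d)
                accepted += 1
            else:
                break
        if accepted == 0:
            tokens.append(lm_next_token(tokens))  # fall back to normal decoding
    return tokens
```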

Language Modelling • Retrieval +1

Poisoning Retrieval Corpora by Injecting Adversarial Passages

1 code implementation • 29 Oct 2023 • Zexuan Zhong, Ziqing Huang, Alexander Wettig, Danqi Chen

Dense retrievers have achieved state-of-the-art performance in various information retrieval tasks, but to what extent can they be safely deployed in real-world applications?

Information Retrieval • Natural Questions +1

Privacy Implications of Retrieval-Based Language Models

1 code implementation • 24 May 2023 • Yangsibo Huang, Samyak Gupta, Zexuan Zhong, Kai Li, Danqi Chen

Crucially, we find that $k$NN-LMs are more susceptible to leaking private information from their private datastore than parametric models.
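For background, a $k$NN-LM interpolates the parametric next-token distribution with a distribution induced by nearest neighbors in a datastore of (context vector, next token) pairs. The toy sketch below (all names and the weight `lam` are illustrative, not the paper's code) shows why datastore contents can surface verbatim: the retrieval term puts probability mass directly on stored tokens.

```python
import numpy as np

# Toy kNN-LM interpolation; a hypothetical sketch, not the paper's setup.
def knn_lm_prob(query_vec, lm_probs, datastore_keys, datastore_vals, k=4, lam=0.25):
    """datastore_keys: (N, d) stored context vectors; datastore_vals: (N,) token ids."""
    dists = np.linalg.norm(datastore_keys - query_vec, axis=1)
    nn = np.argsort(dists)[:k]                   # k nearest stored contexts
    weights = np.exp(-dists[nn])
    weights /= weights.sum()
    knn_probs = np.zeros_like(lm_probs)
    for w, idx in zip(weights, nn):
        knn_probs[datastore_vals[idx]] += w      # mass on *verbatim* stored tokens
    # The retrieval term is what makes private datastore contents recoverable.
    return lam * knn_probs + (1 - lam) * lm_probs
```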

Retrieval

MQuAKE: Assessing Knowledge Editing in Language Models via Multi-Hop Questions

2 code implementations • 24 May 2023 • Zexuan Zhong, Zhengxuan Wu, Christopher D. Manning, Christopher Potts, Danqi Chen

The information stored in large language models (LLMs) falls out of date quickly, and retraining from scratch is often not an option.

Knowledge Editing • Language Modelling +2

Training Language Models with Memory Augmentation

1 code implementation • 25 May 2022 • Zexuan Zhong, Tao Lei, Danqi Chen

Recent work has improved language models (LMs) remarkably by equipping them with a non-parametric memory component.

Language Modelling • Machine Translation

Recovering Private Text in Federated Learning of Language Models

1 code implementation • 17 May 2022 • Samyak Gupta, Yangsibo Huang, Zexuan Zhong, Tianyu Gao, Kai Li, Danqi Chen

For the first time, we show the feasibility of recovering text from large batch sizes of up to 128 sentences.

Federated Learning • Word Embeddings

Structured Pruning Learns Compact and Accurate Models

2 code implementations • ACL 2022 • Mengzhou Xia, Zexuan Zhong, Danqi Chen

The growing size of neural language models has led to increased attention to model compression.

Model Compression

Simple Entity-Centric Questions Challenge Dense Retrievers

1 code implementation • EMNLP 2021 • Christopher Sciavolino, Zexuan Zhong, Jinhyuk Lee, Danqi Chen

Open-domain question answering has exploded in popularity recently due to the success of dense retrieval models, which have surpassed sparse models using only a few supervised training examples.

Data Augmentation • Open-Domain Question Answering +2

Factual Probing Is [MASK]: Learning vs. Learning to Recall

2 code implementations • NAACL 2021 • Zexuan Zhong, Dan Friedman, Danqi Chen

Petroni et al. (2019) demonstrated that it is possible to retrieve world facts from a pre-trained language model by expressing them as cloze-style prompts and interpret the model's prediction accuracy as a lower bound on the amount of factual information it encodes.
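For readers unfamiliar with the setup, a cloze-style factual probe can be run with the Hugging Face `transformers` fill-mask pipeline; the model and prompt here are illustrative, not the paper's exact configuration.

```python
from transformers import pipeline

# Cloze-style factual probe: the model fills in the masked object of a fact.
fill = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill("The capital of France is [MASK]."):
    print(f"{pred['token_str']:>10}  {pred['score']:.3f}")
# Counting a correct top prediction ("paris") as a recalled fact yields the
# lower-bound accuracy described above; the paper asks how much of the gain
# from *learned* prompts reflects stored facts versus learning to recall.
```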

Language Modelling

A Frustratingly Easy Approach for Entity and Relation Extraction

2 code implementations • NAACL 2021 • Zexuan Zhong, Danqi Chen

Our approach essentially builds on two independent encoders and merely uses the entity model to construct the input for the relation model.
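The sketch below illustrates that pipelined design: entity spans and types predicted by the entity model are injected into the text as typed markers, and the relation model classifies the marked pair. The marker format and helper function are hypothetical, loosely following this description.

```python
# Hypothetical sketch of typed entity markers for a pipelined relation model.

def mark_pair(tokens, subj, obj):
    """subj/obj are (start, end, type) spans predicted by the entity model."""
    out = []
    for i, tok in enumerate(tokens):
        if i == subj[0]: out.append(f"<S:{subj[2]}>")
        if i == obj[0]:  out.append(f"<O:{obj[2]}>")
        out.append(tok)
        if i == subj[1]: out.append(f"</S:{subj[2]}>")
        if i == obj[1]:  out.append(f"</O:{obj[2]}>")
    return " ".join(out)

tokens = "Zexuan Zhong works at Princeton University".split()
print(mark_pair(tokens, subj=(0, 1, "PER"), obj=(4, 5, "ORG")))
# <S:PER> Zexuan Zhong </S:PER> works at <O:ORG> Princeton University </O:ORG>
# The relation model would encode this marked sentence and classify the pair.
```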

Joint Entity and Relation Extraction • Multi-Task Learning +3

MULDEF: Multi-model-based Defense Against Adversarial Examples for Neural Networks

no code implementations • 31 Aug 2018 • Siwakorn Srisakaokul, Yuhao Zhang, Zexuan Zhong, Wei Yang, Tao Xie, Bo Li

In particular, given a target model, our framework constructs multiple models from the target model to form a model family.
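Read at face value, that defense can be sketched as per-query random selection from the model family, so an adversarial example crafted against any single member transfers poorly to the model actually answering the query; everything below is a hypothetical stand-in for the framework.

```python
import random

# Hypothetical sketch: `models` holds callables (the family derived from the
# target model), each mapping an input to class scores.
def muldef_predict(x, models, rng=random):
    model = rng.choice(models)   # pick a family member at random per query
    return model(x)
```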
