Search Results for author: Xingchen Wan

Found 21 papers, 13 papers with code

Survival of the Most Influential Prompts: Efficient Black-Box Prompt Search via Clustering and Pruning

1 code implementation • 19 Oct 2023 • Han Zhou, Xingchen Wan, Ivan Vulić, Anna Korhonen

Prompt-based learning has been an effective paradigm for large pretrained language models (LLMs), enabling few-shot or even zero-shot learning.

Combinatorial Optimization • Zero-Shot Learning

Universal Self-Adaptive Prompting

no code implementations • 24 May 2023 • Xingchen Wan, Ruoxi Sun, Hootan Nakhost, Hanjun Dai, Julian Martin Eisenschlos, Sercan O. Arik, Tomas Pfister

A hallmark of modern large language models (LLMs) is their impressive general zero-shot and few-shot abilities, often elicited through in-context learning (ICL) via prompting.

In-Context Learning • Natural Language Understanding • +2

Better Zero-Shot Reasoning with Self-Adaptive Prompting

no code implementations • 23 May 2023 • Xingchen Wan, Ruoxi Sun, Hanjun Dai, Sercan O. Arik, Tomas Pfister

Modern large language models (LLMs) have demonstrated impressive capabilities at sophisticated tasks, often through step-by-step reasoning similar to humans.

Working Memory Capacity of ChatGPT: An Empirical Study

2 code implementations • 30 Apr 2023 • Dongyu Gong, Xingchen Wan, Dingmin Wang

Working memory is a critical aspect of both human intelligence and artificial intelligence, serving as a workspace for the temporary storage and manipulation of information.

Benchmarking • Language Modelling • +1

Bayesian Quadrature for Neural Ensemble Search

1 code implementation • 15 Mar 2023 • Saad Hamid, Xingchen Wan, Martin Jørgensen, Binxin Ru, Michael Osborne

Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks.

AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning

1 code implementation • 28 Jan 2023 • Han Zhou, Xingchen Wan, Ivan Vulić, Anna Korhonen

Large pretrained language models are widely used in downstream NLP tasks via task-specific fine-tuning, but such procedures can be costly.

Bayesian Optimisation • Neural Architecture Search

Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization

2 code implementations • 18 Oct 2022 • Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, Eytan Bakshy

We prove that under suitable reparameterizations, the BO policy that maximizes the probabilistic objective is the same as that which maximizes the acquisition function (AF), and therefore probabilistic reparameterization (PR) enjoys the same regret bounds as the original BO policy using the underlying AF.

Bayesian Optimization
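The entry above compresses the key result; roughly, PR replaces direct discrete acquisition optimisation with gradient ascent on the expected acquisition value under a distribution over the discrete variables. Below is a minimal sketch of that idea for binary variables, using a hypothetical toy acquisition function and a plain score-function gradient estimator; it is an illustration of the principle, not the authors' implementation.

```python
# A minimal sketch of probabilistic reparameterization (PR) for binary variables:
# maximise E_{z ~ Bernoulli(theta)}[AF(z)] over theta by stochastic gradient ascent.
# `toy_acquisition` and all constants are hypothetical stand-ins, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
d = 5  # number of binary design variables


def toy_acquisition(z: np.ndarray) -> float:
    """Hypothetical acquisition value for a binary vector z in {0, 1}^d."""
    target = np.array([1, 0, 1, 1, 0])
    return -float(np.sum((z - target) ** 2))


def expected_af_and_grad(theta: np.ndarray, n_samples: int = 256):
    """Monte Carlo estimate of the probabilistic objective and its
    score-function (REINFORCE) gradient with respect to theta."""
    z = (rng.random((n_samples, d)) < theta).astype(float)
    af = np.array([toy_acquisition(zi) for zi in z])
    # grad_theta log p(z | theta) for independent Bernoulli variables
    score = (z - theta) / (theta * (1.0 - theta))
    return af.mean(), (af[:, None] * score).mean(axis=0)


theta = np.full(d, 0.5)  # start from the uniform distribution over {0, 1}^d
for _ in range(200):
    _, grad = expected_af_and_grad(theta)
    theta = np.clip(theta + 0.05 * grad, 0.01, 0.99)  # projected gradient ascent

# Round the learned probabilities to obtain the next discrete candidate to evaluate.
print("learned probabilities:", np.round(theta, 2))
print("suggested candidate:  ", (theta > 0.5).astype(int))
```

Because the probabilistic objective is differentiable in theta, standard continuous optimisers can be used even though the underlying design space is discrete, which is the practical appeal of the approach.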

Bayesian Generational Population-Based Training

2 code implementations • 19 Jul 2022 • Xingchen Wan, Cong Lu, Jack Parker-Holder, Philip J. Ball, Vu Nguyen, Binxin Ru, Michael A. Osborne

Leveraging the new highly parallelizable Brax physics engine, we show that these innovations lead to large performance gains, significantly outperforming the tuned baseline while learning entire configurations on the fly.

Bayesian Optimization • Reinforcement Learning (RL)

Adversarial Attacks on Graph Classifiers via Bayesian Optimisation

1 code implementation • NeurIPS 2021 • Xingchen Wan, Henry Kenlay, Robin Ru, Arno Blaas, Michael Osborne, Xiaowen Dong

While the majority of the literature focuses on such vulnerability in node-level classification tasks, little effort has been dedicated to analysing adversarial attacks on graph-level classification, an important problem with numerous real-life applications such as biochemistry and social network analysis.

Adversarial Robustness • Bayesian Optimisation • +1

BOiLS: Bayesian Optimisation for Logic Synthesis

no code implementations • 11 Nov 2021 • Antoine Grosnit, Cedric Malherbe, Rasul Tutunov, Xingchen Wan, Jun Wang, Haitham Bou Ammar

Optimising the quality-of-results (QoR) of circuits during logic synthesis is a formidable challenge necessitating the exploration of exponentially sized search spaces.

Bayesian Optimisation • Navigate

Approximate Neural Architecture Search via Operation Distribution Learning

no code implementations • 8 Nov 2021 • Xingchen Wan, Binxin Ru, Pedro M. Esperança, Fabio M. Carlucci

The standard paradigm in Neural Architecture Search (NAS) is to search for a fully deterministic architecture with specific operations and connections.

Bayesian Optimisation • Neural Architecture Search

Adversarial Attacks on Graph Classification via Bayesian Optimisation

1 code implementation • 4 Nov 2021 • Xingchen Wan, Henry Kenlay, Binxin Ru, Arno Blaas, Michael A. Osborne, Xiaowen Dong

While the majority of the literature focuses on such vulnerability in node-level classification tasks, little effort has been dedicated to analysing adversarial attacks on graph-level classification, an important problem with numerous real-life applications such as biochemistry and social network analysis.

Adversarial Robustness • Bayesian Optimisation • +1

Interpretable Neural Architecture Search via Bayesian Optimisation with Weisfeiler-Lehman Kernels

1 code implementation • ICLR 2021 • Binxin Ru, Xingchen Wan, Xiaowen Dong, Michael Osborne

Our method optimises the architecture in a highly data-efficient manner: it is capable of capturing the topological structures of the architectures and is scalable to large graphs, thus making the high-dimensional and graph-like search spaces amenable to BO.

Bayesian Optimisation • Neural Architecture Search
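To make the graph-kernel idea in the entry above concrete, here is a minimal sketch of the Weisfeiler-Lehman (WL) subtree kernel on small labelled graphs, the kind of graph similarity a GP surrogate can be built on. The toy "cells", operation labels, and the plain linear kernel on WL feature counts are illustrative assumptions, not the authors' NAS search space or their exact surrogate.

```python
# A minimal sketch of the Weisfeiler-Lehman subtree kernel on labelled graphs.
# Graphs are given as (adjacency list, node labels); both examples are hypothetical.
from collections import Counter


def wl_features(adj, labels, h=2):
    """Return WL subtree feature counts after h relabelling iterations."""
    feats = Counter(labels)
    current = list(labels)
    for _ in range(h):
        new = []
        for node, nbrs in enumerate(adj):
            # Relabel each node by its own label plus the sorted multiset of
            # neighbour labels, i.e. a compressed subtree pattern.
            new.append(current[node] + "|" + ".".join(sorted(current[n] for n in nbrs)))
        current = new
        feats.update(current)
    return feats


def wl_kernel(g1, g2, h=2):
    """Linear kernel on WL feature counts: k(G1, G2) = <phi(G1), phi(G2)>."""
    f1, f2 = wl_features(*g1, h=h), wl_features(*g2, h=h)
    return sum(f1[k] * f2[k] for k in f1.keys() & f2.keys())


# Two toy "cells": nodes carry operation labels, edges give the wiring.
cell_a = ([[1], [2], []], ["conv3x3", "conv3x3", "maxpool"])
cell_b = ([[1], [2], []], ["conv3x3", "skip", "maxpool"])
print(wl_kernel(cell_a, cell_a), wl_kernel(cell_a, cell_b))
```

Because the kernel operates directly on labelled graphs, the surrogate never needs a fixed-length encoding of an architecture, which is what makes graph-like search spaces tractable for BO.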

Iterative Averaging in the Quest for Best Test Error

no code implementations • 2 Mar 2020 • Diego Granziol, Xingchen Wan, Samuel Albanie, Stephen Roberts

We analyse and explain the increased generalisation performance of iterate averaging using a Gaussian process perturbation model between the true and batch risk surface on the high dimensional quadratic.

Image Classification
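As a concrete reference point for the procedure analysed in the paper above, below is a minimal sketch of tail iterate averaging: keep a running mean of the weights visited by SGD and evaluate with the averaged weights. The toy model, data, and averaging schedule are hypothetical placeholders, not the paper's experimental setup.

```python
# A minimal sketch of tail iterate averaging with SGD. Model and data are toy placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(10, 1)
avg_state = {k: v.clone() for k, v in model.state_dict().items()}
opt = torch.optim.SGD(model.parameters(), lr=0.1)

n_avg = 0
for step in range(200):
    x, y = torch.randn(32, 10), torch.randn(32, 1)
    loss = nn.functional.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step >= 100:  # start averaging once the iterates settle into the noise ball
        n_avg += 1
        for k, v in model.state_dict().items():
            avg_state[k] += (v - avg_state[k]) / n_avg  # running mean of iterates

averaged_model = nn.Linear(10, 1)
averaged_model.load_state_dict(avg_state)  # evaluate with the averaged weights
```

In practice, `torch.optim.swa_utils.AveragedModel` provides the same running-average bookkeeping without the manual loop.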

Deep Curvature Suite

1 code implementation • 20 Dec 2019 • Diego Granziol, Xingchen Wan, Timur Garipov

We present MLRG Deep Curvature suite, a PyTorch-based, open-source package for analysis and visualisation of neural network curvature and loss landscape.

Misconceptions
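The snippet below is not the suite's API; it is a generic sketch of the Hessian-vector product primitive (Pearlmutter's trick) that underpins most loss-landscape and curvature-spectrum analyses of the kind the package above targets, with a hypothetical toy model and data.

```python
# A generic Hessian-vector product via double backpropagation (not the package's API).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(10, 16), nn.Tanh(), nn.Linear(16, 1))
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = nn.functional.mse_loss(model(x), y)

params = [p for p in model.parameters() if p.requires_grad]
grads = torch.autograd.grad(loss, params, create_graph=True)  # keep graph for 2nd pass

# Hessian-vector product: Hv = d/dp (g . v), evaluated for a random direction v.
v = [torch.randn_like(p) for p in params]
gv = sum((g * vi).sum() for g, vi in zip(grads, v))
hv = torch.autograd.grad(gv, params)

# A crude curvature summary: the Rayleigh quotient v^T H v / v^T v.
num = sum((h * vi).sum() for h, vi in zip(hv, v))
den = sum((vi * vi).sum() for vi in v)
print("Rayleigh quotient:", (num / den).item())
```

Spectrum-estimation methods such as Lanczos iteration only need this matrix-free product, which is why it is the standard building block for curvature analysis of large networks.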
