Search Results for author: Haoyuan Wu

Found 4 papers, 2 papers with code

ARKS: Active Retrieval in Knowledge Soup for Code Generation

no code implementations • 19 Feb 2024 • Hongjin Su, Shuyang Jiang, Yuhang Lai, Haoyuan Wu, Boao Shi, Che Liu, Qian Liu, Tao Yu

Recently, the retrieval-augmented generation (RAG) paradigm has attracted much attention for its potential to incorporate external knowledge into large language models (LLMs) without further training.
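As a rough illustration of that paradigm, the sketch below shows a generic active-retrieval RAG loop: retrieved knowledge is packed into the prompt of a frozen LLM, and the retrieval query is refined with the model's own draft. The `retrieve` and `generate` callables are hypothetical placeholders, not the ARKS implementation.

```python
# Hedged sketch of an active-retrieval RAG loop (placeholder interfaces, not ARKS code):
# external knowledge is folded into the prompt of a frozen LLM, with no further training.

from typing import Callable, List

def active_rag(task: str,
               retrieve: Callable[[str, int], List[str]],
               generate: Callable[[str], str],
               rounds: int = 2,
               k: int = 3) -> str:
    query, answer = task, ""
    for _ in range(rounds):
        docs = retrieve(query, k)                  # top-k external knowledge snippets
        context = "\n\n".join(docs)                # pack retrieved knowledge into the prompt
        prompt = f"Context:\n{context}\n\nTask:\n{task}\n\nDraft:\n{answer}\n\nAnswer:"
        answer = generate(prompt)                  # frozen LLM produces the next draft
        query = f"{task}\n{answer}"                # refine the retrieval query with the draft
    return answer
```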

Code Generation • Retrieval

Parameter-Efficient Sparsity Crafting from Dense to Mixture-of-Experts for Instruction Tuning on General Tasks

1 code implementation • 5 Jan 2024 • Haoyuan Wu, Haisheng Zheng, Zhuolun He, Bei Yu

Instruction tuning, a successful paradigm, enhances the ability of LLMs to follow natural language instructions and exhibit robust generalization across a wide range of tasks.
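The title points to converting a dense feed-forward block into a mixture-of-experts for parameter-efficient tuning; the sketch below is one plausible, simplified reading of that idea (expert replication plus a learned top-k router), not the paper's actual method.

```python
# Hedged sketch: craft a small mixture-of-experts from one dense FFN by copying it
# into experts and adding a learned router. Illustrative reading of the title only.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFromDense(nn.Module):
    """Route each token to the top-k copies of a dense FFN (illustrative only)."""

    def __init__(self, dense_ffn: nn.Module, d_model: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList([copy.deepcopy(dense_ffn) for _ in range(num_experts)])
        self.router = nn.Linear(d_model, num_experts)   # learned token-to-expert routing
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, d_model)
        scores = F.softmax(self.router(x), dim=-1)        # routing probabilities
        weights, idx = scores.topk(self.top_k, dim=-1)    # keep the k best experts per token
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                  # tokens sent to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Example usage with a toy dense FFN:
# dense = nn.Sequential(nn.Linear(512, 2048), nn.GELU(), nn.Linear(2048, 512))
# moe = MoEFromDense(dense, d_model=512)
# y = moe(torch.randn(10, 512))
```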

Arithmetic Reasoning • Code Generation • +5

p-Laplacian Adaptation for Generative Pre-trained Vision-Language Models

1 code implementation • 17 Dec 2023 • Haoyuan Wu, Xinyun Zhang, Peng Xu, Peiyu Liao, Xufeng Yao, Bei Yu

In this paper, we present a novel modeling framework that recasts adapter tuning after attention as a graph message passing process on attention graphs, where the projected query and value features and attention matrix constitute the node features and the graph adjacency matrix, respectively.
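A minimal sketch of that graph view, assuming row-normalized attention as the adjacency matrix and the projected value features as node features; the p-Laplacian weighting that gives the paper its name is omitted, so this only illustrates the message-passing framing, not the actual method.

```python
# Hedged sketch of post-attention adapter tuning as graph message passing:
# the attention matrix acts as a dense adjacency matrix and the projected value
# features as node features. The p-Laplacian part of the paper is not shown.

import torch
import torch.nn as nn

class GraphAdapter(nn.Module):
    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)   # small trainable adapter (down-projection)
        self.up = nn.Linear(bottleneck, d_model)     # adapter up-projection

    def forward(self, attn: torch.Tensor, values: torch.Tensor) -> torch.Tensor:
        # attn: (n, n) row-normalized attention (adjacency), values: (n, d_model) node features
        messages = attn @ values                     # one round of neighbor aggregation
        return values + self.up(torch.relu(self.down(messages)))  # residual adapter update
```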

Image Captioning • Question Answering • +3

ChatEDA: A Large Language Model Powered Autonomous Agent for EDA

no code implementations • 20 Aug 2023 • Zhuolun He, Haoyuan Wu, Xinyun Zhang, Xufeng Yao, Su Zheng, Haisheng Zheng, Bei Yu

The integration of a complex set of Electronic Design Automation (EDA) tools to enhance interoperability is a critical concern for circuit designers.

Language Modelling • Large Language Model
