Search Results for author: Boxun Xu

Found 2 papers, 0 papers with code

DISTA: Denoising Spiking Transformer with intrinsic plasticity and spatiotemporal attention

No code implementations • 15 Nov 2023 • Boxun Xu, Hejia Geng, Yuxuan Yin, Peng Li

We introduce DISTA, a Denoising Spiking Transformer with Intrinsic Plasticity and SpatioTemporal Attention, designed to maximize the spatiotemporal computational prowess of spiking neurons, particularly for vision applications.

Tasks: Denoising

UPAR: A Kantian-Inspired Prompting Framework for Enhancing Large Language Model Capabilities

No code implementations • 30 Sep 2023 • Hejia Geng, Boxun Xu, Peng Li

Large Language Models (LLMs) have demonstrated impressive inferential capabilities, and numerous research efforts have been devoted to enhancing this capacity through prompting.

Tasks: Causal Judgment, GSM8K +3
