1 code implementation • 22 Aug 2023 • Xi Xie, Hongwu Peng, Amit Hasan, Shaoyi Huang, Jiahui Zhao, Haowen Fang, Wei Zhang, Tong Geng, Omer Khan, Caiwen Ding
Utilizing these principles, we formulated a kernel for sparse matrix multiplication (SpMM) in GCNs that employs block-level partitioning and a combined warp strategy.
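The paper's kernel is a CUDA implementation; as a rough illustration of block-level row partitioning for SpMM, here is a minimal CSR-based sketch in Python. The function name, `block_size` parameter, and the mapping of blocks to GPU thread blocks described in the comments are assumptions for illustration, not the authors' actual kernel.

```python
# Illustrative SpMM (C = A @ B) over a CSR sparse matrix A, with rows
# partitioned into blocks. On a GPU, each row block would map to a thread
# block and a warp would stride over the dense columns in parallel.
def spmm_blocked(indptr, indices, data, B, block_size=2):
    n_rows, n_cols = len(indptr) - 1, len(B[0])
    C = [[0.0] * n_cols for _ in range(n_rows)]
    for start in range(0, n_rows, block_size):      # one "thread block" per row block
        for row in range(start, min(start + block_size, n_rows)):
            for k in range(indptr[row], indptr[row + 1]):  # nonzeros of this row
                col, val = indices[k], data[k]
                for j in range(n_cols):             # warp lanes would cover j
                    C[row][j] += val * B[col][j]
    return C
```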
no code implementations • 21 Jul 2023 • Zhenhang Zhang, Jingang Jin, Haowen Fang, Qinru Qiu
The algorithm not only learns the synaptic weights but also adapts the temporal filters associated with the synapses.
no code implementations • 24 Apr 2023 • Shaoyi Huang, Haowen Fang, Kaleel Mahmood, Bowen Lei, Nuo Xu, Bin Lei, Yue Sun, Dongkuan Xu, Wujie Wen, Caiwen Ding
Experimental results show that NDSNN achieves up to 20.52% improvement in accuracy on Tiny-ImageNet using ResNet-19 (with a sparsity of 99%) as compared to other SOTA methods (e.g., Lottery Ticket Hypothesis (LTH), SET-SNN, RigL-SNN).
no code implementations • 7 Sep 2022 • Nuo Xu, Kaleel Mahmood, Haowen Fang, Ethan Rathbun, Caiwen Ding, Wujie Wen
First, we show that successful white-box adversarial attacks on SNNs are highly dependent on the underlying surrogate gradient technique, even in the case of adversarially trained SNNs.
no code implementations • 8 May 2021 • Amar Shrestha, Haowen Fang, Daniel Patrick Rider, Zaidao Mei, Qinru Qiu
Although widely used in machine learning, backpropagation cannot directly be applied to SNN training and is not feasible on a neuromorphic processor that emulates biological neurons and synapses.
no code implementations • 21 Apr 2021 • Haowen Fang, Brady Taylor, Ziru Li, Zaidao Mei, Hai Li, Qinru Qiu
This circuit implementation of the neuron model is simulated to demonstrate its ability to react to temporal spiking patterns with an adaptive threshold.
no code implementations • 16 Jul 2020 • Bingbing Li, Santosh Pandey, Haowen Fang, Yanjun Lyv, Ji Li, Jieyang Chen, Mimi Xie, Lipeng Wan, Hang Liu, Caiwen Ding
In natural language processing (NLP), the "Transformer" architecture was proposed as the first transduction model relying entirely on self-attention mechanisms, without using sequence-aligned recurrent neural networks (RNNs) or convolution, and it achieved significant improvements on sequence-to-sequence tasks.
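For context, the self-attention mechanism the excerpt refers to can be sketched as scaled dot-product attention, softmax(QKᵀ/√d)V. This is a generic textbook sketch, not the paper's implementation; the function names are hypothetical.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(Q, K, V):
    # Scaled dot-product attention: each query attends over all keys,
    # and the output is the attention-weighted sum of the values.
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out
```

Unlike an RNN, every position attends to every other position in one step, which is what removes the sequence-aligned recurrence the excerpt mentions.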
no code implementations • 7 Jul 2020 • Haowen Fang, Amar Shrestha, Qinru Qiu
A training algorithm to classify spatial-temporal patterns is also proposed.
no code implementations • 6 Jun 2020 • Amar Shrestha, Krittaphat Pugdeethosapol, Haowen Fang, Qinru Qiu
Grounding free-form textual queries necessitates an understanding of these textual phrases and their relation to the visual cues in order to reliably reason about the described locations.
no code implementations • 22 Mar 2020 • Ziyi Zhao, Haowen Fang, Zhao Jin, Qinru Qiu
Trajectory prediction is a critical and challenging problem in the design of an autonomous driving system.
2 code implementations • 19 Feb 2020 • Haowen Fang, Amar Shrestha, Ziyi Zhao, Qinru Qiu
A bio-plausible SNN model with spatial-temporal property is a complex dynamic system.
no code implementations • 8 Jan 2020 • Amar Shrestha, Krittaphat Pugdeethosapol, Haowen Fang, Qinru Qiu
When the navigational environment is known, it can be represented as a graph where landmarks are nodes, the robot behaviors that move from node to node are edges, and the route is a set of behavioral instructions.