no code implementations • 25 Mar 2024 • Huifeng Yin, Hanle Zheng, Jiayi Mao, Siyuan Ding, Xing Liu, Mingkun Xu, Yifan Hu, Jing Pei, Lei Deng
By designing and evaluating several variants of the classic model, we systematically investigate the functional roles of key modelling components (leakage, reset, and recurrence) in leaky integrate-and-fire (LIF) based SNNs.
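The three components named above can be illustrated with a minimal discrete-time LIF layer; the update rule, parameter names, and values below are a generic textbook sketch for illustration, not the paper's actual model or hyperparameters.

```python
import numpy as np

def lif_step(v, spikes_prev, x, w_rec, tau=2.0, v_th=1.0):
    """One discrete-time LIF update for a layer of neurons.

    Illustrates the three components studied:
    leakage (membrane decay), reset (after a spike),
    and recurrence (previous spikes fed back via w_rec).
    """
    leak = 1.0 - 1.0 / tau                  # leakage: membrane potential decays
    recurrent = w_rec @ spikes_prev         # recurrence: feedback from last step
    v = leak * v + x + recurrent            # integrate input and feedback
    spikes = (v >= v_th).astype(float)      # fire where threshold is crossed
    v = v * (1.0 - spikes)                  # reset: zero membrane where spiked
    return v, spikes

# Toy rollout over 10 time steps with 4 neurons and random input current.
rng = np.random.default_rng(0)
n = 4
v = np.zeros(n)
spikes = np.zeros(n)
w_rec = 0.1 * rng.standard_normal((n, n))
for t in range(10):
    x = rng.uniform(0.0, 0.6, size=n)
    v, spikes = lif_step(v, spikes, x, w_rec)
```

Ablating a component (e.g. setting `tau` very large to remove leakage, or skipping the reset line) is the kind of variant the abstract describes evaluating.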
no code implementations • 25 Mar 2024 • Huifeng Yin, Mingkun Xu, Jing Pei, Lei Deng
Graph representation learning has become a crucial task in machine learning and data mining due to its potential for modeling complex structures such as social networks, chemical compounds, and biological systems.
no code implementations • 30 Jun 2021 • Mingkun Xu, Yujie Wu, Lei Deng, Faqiang Liu, Guoqi Li, Jing Pei
Biological spiking neurons with intrinsic dynamics underlie the powerful representation and learning capabilities of the brain for processing multimodal information in complex environments.
no code implementations • 5 Oct 2020 • Xuming Ran, Mingkun Xu, Qi Xu, Huihui Zhou, Quanying Liu
Likelihood-based generative models have been reported to be highly robust to out-of-distribution (OOD) inputs and can serve as detectors, under the assumption that the model assigns higher likelihoods to samples from the in-distribution (ID) dataset than to those from an OOD dataset.
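The detection rule described above can be sketched with a simple stand-in generative model: fit a density on ID data, then flag inputs whose log-likelihood falls below a threshold as OOD. The diagonal Gaussian model, the synthetic data, and the percentile threshold are illustrative assumptions, not the method evaluated in the paper.

```python
import numpy as np

def fit_gaussian(x):
    # Stand-in generative model: diagonal Gaussian fit to ID data.
    return x.mean(axis=0), x.var(axis=0) + 1e-6

def log_likelihood(x, mu, var):
    # Per-sample log-likelihood under the diagonal Gaussian.
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var, axis=1)

rng = np.random.default_rng(1)
id_train = rng.normal(0.0, 1.0, size=(1000, 8))   # synthetic ID samples
ood = rng.normal(5.0, 1.0, size=(100, 8))         # mean-shifted OOD samples

mu, var = fit_gaussian(id_train)
# Threshold at the 5th percentile of ID log-likelihoods
# (accepting ~5% false positives on ID data).
threshold = np.percentile(log_likelihood(id_train, mu, var), 5)
is_ood = log_likelihood(ood, mu, var) < threshold
```

The failure mode the abstract alludes to is that, with deep generative models on real images, this likelihood-threshold assumption can break: some OOD datasets receive *higher* likelihoods than ID data.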
no code implementations • 16 Jul 2020 • Xuming Ran, Mingkun Xu, Lingrui Mei, Qi Xu, Quanying Liu
To address this problem, reliable uncertainty estimation is considered critical for an in-depth understanding of OOD inputs.
no code implementations • 5 Jun 2020 • Yujie Wu, Rong Zhao, Jun Zhu, Feng Chen, Mingkun Xu, Guoqi Li, Sen Song, Lei Deng, Guanrui Wang, Hao Zheng, Jing Pei, Youhui Zhang, Mingguo Zhao, Luping Shi
We demonstrate the advantages of this model in multiple different tasks, including few-shot learning, continual learning, and fault-tolerance learning in neuromorphic vision sensors.
1 code implementation • 20 Dec 2019 • Faqiang Liu, Mingkun Xu, Guoqi Li, Jing Pei, Luping Shi, Rong Zhao
Generative adversarial networks have achieved remarkable performance on various tasks but suffer from training instability.
no code implementations • 25 Sep 2019 • Faqiang Liu, Mingkun Xu, Guoqi Li, Jing Pei, Luping Shi
Generative adversarial networks have achieved remarkable performance on various tasks but suffer from sensitivity to hyper-parameters, training instability, and mode collapse.