no code implementations • 29 Feb 2024 • Yi Zeng, Feifei Zhao, Yuxuan Zhao, Dongcheng Zhao, Enmeng Lu, Qian Zhang, Yuwei Wang, Hui Feng, Zhuoya Zhao, Jihang Wang, Qingqun Kong, Yinqian Sun, Yang Li, Guobin Shen, Bing Han, Yiting Dong, Wenxuan Pan, Xiang He, Aorigele Bao, Jin Wang
In this paper, we introduce a Brain-inspired and Self-based Artificial Intelligence (BriSe AI) paradigm.
no code implementations • 18 Sep 2023 • Bing Han, Feifei Zhao, Wenxuan Pan, Zhuoya Zhao, Xianqi Li, Qingqun Kong, Yi Zeng
In this paper, we propose a brain-inspired continual learning algorithm with adaptive reorganization of neural pathways: Self-Organizing Regulation networks reorganize a single, limited Spiking Neural Network (SOR-SNN) into rich, sparse neural pathways to cope efficiently with incremental tasks.
no code implementations • 11 Sep 2023 • Wenxuan Pan, Feifei Zhao, Zhuoya Zhao, Yi Zeng
This work explores brain-inspired neural architectures suitable for SNNs and also provides preliminary insights into the evolutionary mechanisms of biological neural networks in the human brain.
1 code implementation • 9 Aug 2023 • Bing Han, Feifei Zhao, Yi Zeng, Wenxuan Pan, Guobin Shen
In addition, the overlapping shared structure helps to quickly transfer all acquired knowledge to new tasks, enabling a single network to support multiple incremental tasks without a separate sub-network mask for each task.
no code implementations • 21 Apr 2023 • Wenxuan Pan, Feifei Zhao, Guobin Shen, Yi Zeng
The neural motif topology, modular regional structure, and global cross-region connectivity of the human brain are products of natural evolution and can serve as an ideal reference for designing brain-inspired SNN architectures.
no code implementations • 31 Mar 2023 • Wenxuan Pan, Feifei Zhao, Yi Zeng, Bing Han
For structural evolution, an adaptive evolvable LSM model is developed to optimize the neural architecture of the liquid layer with respect to its separation property.
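The separation property referred to above is a standard Liquid State Machine criterion: distinct input streams should drive the liquid (a random recurrent spiking reservoir) into distinguishable states. A minimal, generic sketch of measuring it — not the paper's evolvable model; the reservoir here is a plain randomly wired spiking network, and names like `run_liquid` are illustrative:

```python
import numpy as np

def run_liquid(inputs, W_in, W_rec, threshold=1.0, decay=0.9):
    """Simulate a minimal liquid: leaky integrate-and-fire-style reservoir.

    inputs: (T, n_in) binary spike train. Returns the final membrane state,
    used here as the liquid's readout state.
    """
    n_res = W_rec.shape[0]
    v = np.zeros(n_res)        # membrane potentials
    spikes = np.zeros(n_res)   # spikes emitted at the previous step
    for t in range(inputs.shape[0]):
        v = decay * v + W_in @ inputs[t] + W_rec @ spikes
        spikes = (v >= threshold).astype(float)
        v = np.where(spikes > 0, 0.0, v)  # reset neurons that fired
    return v

def separation(state_a, state_b):
    """Separation property: distance between liquid states for two inputs."""
    return np.linalg.norm(state_a - state_b)

rng = np.random.default_rng(0)
n_in, n_res, T = 4, 32, 50
W_in = rng.normal(0.0, 0.5, (n_res, n_in))    # input projection
W_rec = rng.normal(0.0, 0.1, (n_res, n_res))  # random recurrent wiring

# Two distinct Poisson-like input spike streams
u1 = (rng.random((T, n_in)) < 0.3).astype(float)
u2 = (rng.random((T, n_in)) < 0.3).astype(float)

s1 = run_liquid(u1, W_in, W_rec)
s2 = run_liquid(u2, W_in, W_rec)
print(separation(s1, s2))  # > 0: distinct inputs yield distinct liquid states
```

An architecture search over the liquid, as the abstract describes, would score candidate reservoir topologies by a separation measure like this and evolve toward wirings that keep distinct inputs far apart in state space.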
no code implementations • 22 Nov 2022 • Bing Han, Feifei Zhao, Yi Zeng, Wenxuan Pan
Experimental results on spatial (MNIST, CIFAR-10) and temporal neuromorphic (N-MNIST, DVS-Gesture) datasets demonstrate that our method flexibly learns an appropriate compression rate for each task, achieving superior performance while greatly reducing network energy consumption.