no code implementations • 23 Mar 2024 • Viet Dung Nguyen, Reynold Bailey, Gabriel J. Diaz, Chengyi Ma, Alexander Fix, Alexander Ororbia
As a remedy, we use dimensionality-reduction techniques to measure the overlap between the target eye images and synthetic training data, and to prune the training dataset in a manner that maximizes distribution overlap.
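The pruning idea can be sketched in miniature. This is a hypothetical one-dimensional stand-in for the paper's dimensionality-reduction step: each sample here is treated as an already-projected scalar, and `prune_to_overlap` with its margin parameter is illustrative, not the authors' method.

```python
import random
import statistics

def prune_to_overlap(synthetic, target, keep_margin=1.0):
    """Keep only synthetic samples that fall inside the target
    distribution's high-density region along one projected dimension.
    Samples are assumed to be scalar 'projections' already."""
    mu = statistics.mean(target)
    sigma = statistics.stdev(target)
    lo, hi = mu - keep_margin * sigma, mu + keep_margin * sigma
    return [s for s in synthetic if lo <= s <= hi]

random.seed(0)
target = [random.gauss(0.0, 1.0) for _ in range(500)]
synthetic = [random.gauss(2.0, 2.0) for _ in range(500)]  # shifted distribution
pruned = prune_to_overlap(synthetic, target)
# `pruned` retains only the synthetic samples that overlap the target
```

Real pipelines would project image features with a learned or linear dimensionality reduction first; the pruning step itself reduces to a range test of this kind.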
no code implementations • 19 Feb 2024 • Hitesh Vaidya, Travis Desell, Ankur Mali, Alexander Ororbia
The major challenge that makes crafting such a system difficult is known as catastrophic forgetting - an agent, such as one based on artificial neural networks (ANNs), struggles to retain previously acquired knowledge when learning from new samples.
no code implementations • 16 Feb 2024 • Alexander Ororbia, Ankur Mali, Adam Kohan, Beren Millidge, Tommaso Salvatori
As a result, it accommodates hardware and scientific modeling, e.g., learning with physical systems and non-differentiable behavior.
no code implementations • 12 Jan 2024 • Zimeng Lyu, Alexander Ororbia, Rui Li, Travis Desell
In this work, we introduce a semi-supervised learning approach based on topological projections in self-organizing maps (SOMs), which significantly reduces the required number of labeled data points to perform parameter prediction, effectively exploiting information contained in large unlabeled datasets.
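The core mechanism, projecting unlabeled points through a SOM and inheriting labels from nearby labeled samples, can be sketched as follows. This is a minimal toy version, not the paper's implementation: the tiny 1-D map, cluster data, and `predict` rule are all illustrative assumptions.

```python
import math
import random

random.seed(7)

def train_som(data, n_units=4, epochs=30, lr=0.5):
    """Tiny 1-D SOM over 2-D points: the best-matching unit (BMU) is
    pulled toward each input, its map neighbors less strongly."""
    protos = [list(random.choice(data)) for _ in range(n_units)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)  # learning rate decays
        for x in data:
            bmu = min(range(n_units), key=lambda i: math.dist(protos[i], x))
            for i in range(n_units):
                h = math.exp(-abs(i - bmu))  # neighborhood falloff on the map
                protos[i] = [p + rate * h * (xj - p)
                             for p, xj in zip(protos[i], x)]
    return protos

def predict(protos, labeled, x):
    """Project x onto its best-matching prototype, then inherit the
    label of the labeled sample closest to that prototype."""
    bmu = min(protos, key=lambda p: math.dist(p, x))
    _, label = min(labeled, key=lambda pair: math.dist(pair[0], bmu))
    return label

# Two clusters, but only one labeled point per cluster.
cluster_a = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(50)]
cluster_b = [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(50)]
protos = train_som(cluster_a + cluster_b)
labeled = [((0.0, 0.0), "A"), ((5.0, 5.0), "B")]
```

The point of the approach is visible even at this scale: two labels suffice to annotate the whole map, so every unlabeled point receives a prediction for free.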
no code implementations • 16 Nov 2023 • Alexander Ororbia, Karl Friston
This review motivates and synthesizes research efforts in neuroscience-inspired artificial intelligence and biomimetic computing in terms of mortal computation.
no code implementations • 14 Oct 2023 • Alexander Ororbia, Mary Alexandria Kelly
Over the last few years, large neural generative models, capable of synthesizing semantically rich passages of text or producing complex images, have emerged as a popular representation of what has come to be known as "generative artificial intelligence" (generative AI).
no code implementations • 26 Sep 2023 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Lee Giles
In this work, we extend the theoretical foundation for the $2^{nd}$-order recurrent network ($2^{nd}$ RNN) and prove there exists a class of $2^{nd}$ RNNs that is Turing-complete with bounded time.
no code implementations • 15 Aug 2023 • Tommaso Salvatori, Ankur Mali, Christopher L. Buckley, Thomas Lukasiewicz, Rajesh P. N. Rao, Karl Friston, Alexander Ororbia
Artificial intelligence (AI) is rapidly becoming one of the key technologies of this century.
1 code implementation • Findings of the Association for Computational Linguistics: ACL 2023 • Tharindu Cyril Weerasooriya, Alexander Ororbia, Raj Bhensadadia, Ashiqur KhudaBukhsh, Christopher Homan
Annotator disagreement is common whenever human judgment is needed for supervised learning.
1 code implementation • 11 May 2023 • AbdElRahman ElSaid, Karl Ricanek, Zimeng Lyu, Alexander Ororbia, Travis Desell
Continuous Ant-based Topology Search (CANTS) is a previously introduced nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization (ACO).
no code implementations • 30 Mar 2023 • Alexander Ororbia
We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel and adapt their synaptic efficacies without the use of feedback pathways.
no code implementations • 20 Feb 2023 • Zimeng Lyu, Alexander Ororbia, Travis Desell
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods, including online linear regression, fixed long short-term memory (LSTM) and gated recurrent unit (GRU) models trained online, as well as state-of-the-art, online ARIMA strategies.
1 code implementation • 4 Jan 2023 • Alexander Ororbia, Ankur Mali
We propose the predictive forward-forward (PFF) algorithm for conducting credit assignment in neural systems.
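The forward-forward idea that PFF builds on can be sketched with a single layer. What follows is a hedged illustration of the basic layer-local "goodness" rule (sum of squared activations pushed above a threshold for positive data, below it for negative data), not the PFF algorithm itself; the dimensions, threshold, and example patterns are arbitrary.

```python
import math
import random

random.seed(0)

DIM_IN, DIM_OUT = 4, 3
W = [[random.uniform(-0.5, 0.5) for _ in range(DIM_IN)] for _ in range(DIM_OUT)]
THETA = 1.0  # goodness threshold

def layer(x):
    # ReLU(Wx)
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in W]

def goodness(a):
    return sum(ai * ai for ai in a)

def ff_update(x, positive, lr=0.05):
    """One local step: raise goodness above THETA for positive samples,
    lower it below THETA for negative ones. No backward pass through
    other layers is required; credit assignment stays local."""
    a = layer(x)
    g = goodness(a)
    sign = 1.0 if positive else -1.0
    # p = sigmoid(sign * (g - THETA)); ascend log p
    p = 1.0 / (1.0 + math.exp(-sign * (g - THETA)))
    coeff = lr * sign * (1.0 - p)
    for j in range(DIM_OUT):
        if a[j] > 0.0:  # ReLU gate: only active units receive updates
            for i in range(DIM_IN):
                W[j][i] += coeff * 2.0 * a[j] * x[i]

pos = [1.0, 0.5, 0.0, 0.2]   # illustrative "positive" pattern
neg = [0.0, 0.1, 1.0, 0.9]   # illustrative "negative" pattern
for _ in range(200):
    ff_update(pos, positive=True)
    ff_update(neg, positive=False)
```

After training, the layer assigns markedly higher goodness to the positive pattern than to the negative one, which is the discriminative signal the scheme relies on.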
no code implementations • 22 Nov 2022 • Alexander Ororbia, Ankur Mali
In this work, we develop convolutional neural generative coding (Conv-NGC), a generalization of predictive coding to the case of convolution/deconvolution-based computation.
no code implementations • 16 Nov 2022 • Zhizhuo Yang, Gabriel J. Diaz, Brett R. Fajen, Reynold Bailey, Alexander Ororbia
The active inference framework (AIF) is a promising new computational framework grounded in contemporary neuroscience that can produce human-like behavior through reward-based learning.
no code implementations • 19 Sep 2022 • Alexander Ororbia, Ankur Mali
In this article, we propose a backpropagation-free approach to robotic control through the neuro-cognitive computational framework of neural generative coding (NGC), designing an agent built completely from powerful predictive coding/processing circuits that facilitate dynamic, online learning from sparse rewards, embodying the principles of planning-as-inference.
no code implementations • 31 Mar 2022 • Alexander Ororbia, M. Alex Kelly
We present the COGnitive Neural GENerative system (CogNGen), a cognitive architecture that combines two neurobiologically-plausible, computational models: predictive processing and hyperdimensional/vector-symbolic models.
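The hyperdimensional/vector-symbolic half of that pairing can be illustrated compactly. This is a generic bipolar-hypervector sketch of binding, bundling, and unbinding, standard VSA operations, not CogNGen's specific model; the dimensionality and role/filler names are illustrative.

```python
import random

random.seed(42)
D = 2048  # hypervector dimensionality

def hv():
    """Random bipolar hypervector; any two are nearly orthogonal."""
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    # Elementwise multiply: associates a role with a filler; self-inverse.
    return [x * y for x, y in zip(a, b)]

def bundle(*vs):
    # Majority vote per component: superposes several vectors into one.
    return [1 if sum(col) >= 0 else -1 for col in zip(*vs)]

def sim(a, b):
    # Normalized dot product (cosine for bipolar vectors).
    return sum(x * y for x, y in zip(a, b)) / D

role_color, role_shape = hv(), hv()
red, square = hv(), hv()

# "red square" stored as a single distributed record
record = bundle(bind(role_color, red), bind(role_shape, square))

# Binding is self-inverse, so rebinding with the role unbinds the filler:
probe = bind(role_color, record)  # close to `red`, far from `square`
```

The appeal for cognitive architectures is that structured records (role-filler pairs) live in the same fixed-width vector space a neural model already operates on.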
no code implementations • 27 Jan 2022 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Lee Giles
Recent advances in deep learning have resulted in image compression algorithms that outperform JPEG and JPEG 2000 on the standard Kodak benchmark.
no code implementations • 27 Jan 2022 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Lee Giles
In light of this, we propose a system that learns to improve the encoding performance by enhancing its internal neural representations on both the encoder and decoder ends, an approach we call Neural JPEG.
no code implementations • 9 Dec 2021 • Hitesh Vaidya, Travis Desell, Alexander Ororbia
A lifelong learning agent is able to continually learn from potentially infinite streams of pattern sensory data.
no code implementations • Findings (EMNLP) 2021 • Diptanu Sarkar, Marcos Zampieri, Tharindu Ranasinghe, Alexander Ororbia
Transformer-based models such as BERT, XLNET, and XLM-R have achieved state-of-the-art performance across various NLP tasks including the identification of offensive language and hate speech, an important problem in social media.
no code implementations • 10 Jul 2021 • Alexander Ororbia, Ankur Mali
In humans, perceptual awareness facilitates the fast recognition and extraction of information from sensory input.
no code implementations • 15 May 2021 • Alexander Ororbia, M. A. Kelly
In this article, we present a cognitive architecture that is built from powerful yet simple neural models.
1 code implementation • SEMEVAL 2021 • Tharindu Ranasinghe, Diptanu Sarkar, Marcos Zampieri, Alexander Ororbia
In recent years, the widespread use of social media has led to an increase in the generation of toxic and offensive content on online platforms.
no code implementations • 7 Apr 2021 • Ankur Mali, Alexander Ororbia, Daniel Kifer, C. Lee Giles
Two particular tasks that test this type of reasoning are (1) mathematical equation verification, which requires determining whether trigonometric and linear algebraic statements are valid identities or not, and (2) equation completion, which entails filling in a blank within an expression to make it true.
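The equation-verification task admits a simple numerical baseline that clarifies what "valid identity" means here. This is a hypothetical sampling check, not the paper's neural approach: agreement at many random points is strong (though not conclusive) evidence of an identity.

```python
import math
import random

def verify_identity(lhs, rhs, trials=100, tol=1e-9):
    """Numerically test a candidate identity at random sample points."""
    random.seed(0)
    for _ in range(trials):
        x = random.uniform(-3.0, 3.0)
        if abs(lhs(x) - rhs(x)) > tol:
            return False
    return True

# sin^2 x + cos^2 x = 1  -> a valid identity
valid = verify_identity(lambda x: math.sin(x) ** 2 + math.cos(x) ** 2,
                        lambda x: 1.0)

# sin(2x) = 2 sin x  -> not an identity (missing the cos x factor)
invalid = verify_identity(lambda x: math.sin(2 * x),
                          lambda x: 2 * math.sin(x))
```

Equation completion can reuse the same check: each candidate fill for the blank yields a concrete equation, and only the correct fill survives verification.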
no code implementations • 7 Dec 2020 • Alexander Ororbia, Daniel Kifer
Neural generative models can be used to learn complex probability distributions from data, to sample from them, and to produce probability density estimates.
no code implementations • 21 Nov 2020 • AbdElRahman ElSaid, Joshua Karns, Zimeng Lyu, Alexander Ororbia, Travis Desell
This work introduces Continuous Ant-based Neural Topology Search (CANTS), a novel, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization, which utilizes synthetic ants that move over a continuous search space based on the density and distribution of pheromones, and is strongly inspired by how ants move in the real world.
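Continuous-domain ant colony optimization can be sketched in a few lines. This is a generic illustration of the technique, not CANTS itself: here a small solution archive stands in for pheromone deposits, ants sample near well-ranked archive entries, and the sampling spread shrinks over time. All parameter names and values are illustrative.

```python
import random

def continuous_aco(f, bounds, n_ants=10, archive_size=5, iters=60, sigma=0.5):
    """Minimize f over a 1-D interval with a minimal continuous ACO:
    the archive of best solutions plays the role of pheromone."""
    random.seed(3)
    lo, hi = bounds
    archive = sorted((f(x), x) for x in
                     (random.uniform(lo, hi) for _ in range(archive_size)))
    for t in range(iters):
        spread = sigma * (1 - t / iters)  # pheromone sharpens over time
        for _ in range(n_ants):
            # bias sampling toward the better-ranked half of the archive
            _, center = random.choice(archive[: max(1, archive_size // 2)])
            x = min(hi, max(lo, random.gauss(center, spread)))
            archive.append((f(x), x))
        archive = sorted(archive)[:archive_size]
    return archive[0]  # (best value, best position)

best_val, best_x = continuous_aco(lambda x: (x - 1.5) ** 2, (-5.0, 5.0))
```

In a NAS setting the "position" would encode an architecture rather than a scalar, but the sampling-around-pheromone dynamic is the same.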
no code implementations • 4 Apr 2020 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Clyde Lee Giles
In this paper, we improve the memory-augmented RNN with important architectural and state updating mechanisms that ensure that the model learns to properly balance the use of its latent states with external memory.
no code implementations • 10 Feb 2020 • Alexander Ororbia, Ankur Mali, Daniel Kifer, C. Lee Giles
Training deep neural networks on large-scale datasets requires significant hardware resources whose costs (even on cloud platforms) put them out of reach of smaller organizations, groups, and individuals.
no code implementations • 7 Sep 2019 • Ankur Mali, Alexander Ororbia, C. Lee Giles
The NSPDA is also compared to a classical analog stack neural network pushdown automaton (NNPDA) as well as a wide array of first and second-order RNNs with and without external memory, trained using different learning algorithms.
no code implementations • 23 Aug 2019 • Alexander Ororbia
For energy-efficient computation in specialized neuromorphic hardware, we present spiking neural coding, an instantiation of a family of artificial neural models grounded in the theory of predictive coding.
no code implementations • 25 May 2019 • Alexander Ororbia, Ankur Mali, Daniel Kifer, C. Lee Giles
In lifelong learning systems based on artificial neural networks, one of the biggest obstacles is the inability to retain old knowledge as new information is encountered.
1 code implementation • 6 Feb 2019 • Alexander Ororbia, AbdElRahman ElSaid, Travis Desell
This paper presents a new algorithm, Evolutionary eXploration of Augmenting Memory Models (EXAMM), which is capable of evolving recurrent neural networks (RNNs) using a wide variety of memory structures, such as Delta-RNN, GRU, LSTM, MGU and UGRNN cells.
1 code implementation • 17 Oct 2018 • Alexander Ororbia, Ankur Mali, C. Lee Giles, Daniel Kifer
We compare our model and learning procedure to other back-propagation through time alternatives (which also tend to be computationally expensive), including real-time recurrent learning, echo state networks, and unbiased online recurrent optimization.