Search Results for author: Mark D. McDonnell

Found 10 papers, 5 papers with code

Premonition: Using Generative Models to Preempt Future Data Changes in Continual Learning

1 code implementation • 12 Mar 2024 • Mark D. McDonnell, Dong Gong, Ehsan Abbasnejad, Anton Van Den Hengel

We show here that the combination of a large language model and an image generation model can similarly provide useful premonitions as to how a continual learning challenge might develop over time.

Continual Learning • Fine-Grained Image Classification • +3

Single-bit-per-weight deep convolutional neural networks without batch-normalization layers for embedded systems

1 code implementation • 16 Jul 2019 • Mark D. McDonnell, Hesham Mostafa, Runchun Wang, Andre van Schaik

We found, following experiments with wide residual networks applied to the ImageNet, CIFAR-10 and CIFAR-100 image classification datasets, that BN layers do not consistently offer a significant advantage.

Ranked #94 on Image Classification on CIFAR-100 (using extra training data)

General Classification • Image Classification

Diagnosing Convolutional Neural Networks using their Spectral Response

no code implementations • 8 Oct 2018 • Victor Stamatescu, Mark D. McDonnell

Convolutional Neural Networks (CNNs) are a class of artificial neural networks whose computational blocks use convolution, together with other linear and non-linear operations, to perform classification or regression.

General Classification • Image Classification
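The snippet above describes the basic CNN building block: a convolution followed by other linear and non-linear operations. As a minimal illustration (not code from the paper), the core linear step can be sketched as a valid-mode 2D cross-correlation, which is what deep learning frameworks call "convolution":

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation ("convolution" in CNN parlance):
    slide the kernel over the image and take elementwise products."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Apply a simple horizontal difference filter to a toy image.
edge = conv2d(np.eye(5), np.array([[1.0, -1.0]]))
```

In a CNN block this output would then pass through a non-linearity such as a ReLU before the next layer.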

Training wide residual networks for deployment using a single bit for each weight

5 code implementations • ICLR 2018 • Mark D. McDonnell

Using wide residual networks as our main baseline, our approach simplifies existing methods that binarize weights by applying the sign function in training; we apply scaling factors for each layer with constant unlearned values equal to the layer-specific standard deviations used for initialization.
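The idea in the snippet above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's code: each layer's weights are replaced by their signs, scaled by a constant, unlearned factor assumed here to be the He-initialization standard deviation for that layer:

```python
import numpy as np

def binarize_with_fixed_scale(w, fan_in):
    """Single-bit-per-weight sketch (assumed details): replace each weight
    with its sign, scaled by the layer-specific He-init standard deviation,
    which stays constant (unlearned) throughout training."""
    scale = np.sqrt(2.0 / fan_in)  # He-initialization std for this layer
    return scale * np.sign(w)

# Toy layer: 256 inputs, 128 outputs, He-initialized weights.
rng = np.random.default_rng(0)
w = rng.normal(0.0, np.sqrt(2.0 / 256), size=(256, 128))
w_bin = binarize_with_fixed_scale(w, fan_in=256)
# Every surviving magnitude equals the single per-layer scale, so each
# weight needs only one bit (its sign) at deployment time.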

Track Everything: Limiting Prior Knowledge in Online Multi-Object Recognition

no code implementations • 21 Apr 2017 • Sebastien C. Wong, Victor Stamatescu, Adam Gatt, David Kearney, Ivan Lee, Mark D. McDonnell

We argue that by transferring the use of prior knowledge from the detection and tracking stages to the classification stage we can design a robust, general purpose object recognition system with the ability to detect and track a variety of object types.

General Classification • Multi-Object Tracking • +4

Understanding data augmentation for classification: when to warp?

no code implementations • 28 Sep 2016 • Sebastien C. Wong, Adam Gatt, Victor Stamatescu, Mark D. McDonnell

In this paper we investigate the benefit of augmenting data with synthetically created samples when training a machine learning classifier.

Classification • Data Augmentation • +1
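Data-space augmentation of the kind the snippet above investigates creates synthetic training samples by applying label-preserving transforms to real ones. A toy sketch (illustrative only; the paper studies more elaborate warps) using flips and small translations:

```python
import numpy as np

def augment(img, rng):
    """Toy label-preserving augmentation: random horizontal flip plus a
    small random translation (up to 2 pixels in each direction)."""
    if rng.random() < 0.5:
        img = img[:, ::-1]                  # horizontal flip
    dy, dx = rng.integers(-2, 3, size=2)    # random shift
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

# Generate four synthetic variants of one toy 8x8 "image".
rng = np.random.default_rng(0)
img = np.arange(64, dtype=float).reshape(8, 8)
batch = np.stack([augment(img, rng) for _ in range(4)])
```

Each synthetic sample keeps the original label, so the classifier sees a larger, more varied training set at no extra labeling cost.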

Enhanced Image Classification With a Fast-Learning Shallow Convolutional Neural Network

no code implementations • 16 Mar 2015 • Mark D. McDonnell, Tony Vladusich

We present a neural network architecture and training method designed to enable very rapid training and low implementation complexity.

General Classification • Image Classification

Fast, simple and accurate handwritten digit classification by training shallow neural network classifiers with the 'extreme learning machine' algorithm

no code implementations • 29 Dec 2014 • Mark D. McDonnell, Migel D. Tissera, Tony Vladusich, André van Schaik, Jonathan Tapson

Our close-to-state-of-the-art results for MNIST and NORB suggest that the ease of use and accuracy of the ELM algorithm for designing a single-hidden-layer neural network classifier should earn it greater consideration, either as a standalone method for simpler problems or as the final classification stage in deep neural networks applied to more difficult problems.

General Classification • speech-recognition • +1
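The ELM training procedure referred to above is simple enough to sketch in full. This is a generic minimal version under assumed details, not the paper's exact setup: input weights are random and fixed, and only the output weights are learned, in closed form by least squares:

```python
import numpy as np

def elm_train(X, y_onehot, n_hidden=64, seed=0):
    """Minimal ELM sketch: fixed random input weights, ReLU hidden layer,
    output weights solved in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))     # random, never trained
    H = np.maximum(X @ W, 0.0)                      # hidden activations
    beta, *_ = np.linalg.lstsq(H, y_onehot, rcond=None)
    return W, beta

def elm_predict(X, W, beta):
    return np.argmax(np.maximum(X @ W, 0.0) @ beta, axis=1)

# Toy usage: two well-separated Gaussian blobs in 4 dimensions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (50, 4)), rng.normal(2, 1, (50, 4))])
labels = np.array([0] * 50 + [1] * 50)
W, beta = elm_train(X, np.eye(2)[labels])
acc = (elm_predict(X, W, beta) == labels).mean()
```

Because the only "training" is one linear solve, fitting is extremely fast, which is the speed advantage the title highlights.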

Channel noise induced stochastic facilitation in an auditory brainstem neuron model

1 code implementation • 11 Nov 2013 • Brett A. Schmerl, Mark D. McDonnell

This holds whether the firing dynamics in the model are phasic (SBSR can occur due to channel noise) or tonic (ISR can occur due to channel noise).

Neurons and Cognition • Subcellular Processes
