We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms.
We present MorphNet, an approach to automate the design of neural network structures.
In our experiments, we search for the best convolutional layer (or "cell") on the CIFAR-10 dataset and then apply this cell to the ImageNet dataset by stacking together more copies of it, each with its own parameters, to design a convolutional architecture named the "NASNet architecture".
#8 best model for Image Classification on ImageNet
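A minimal PyTorch sketch (an illustration, not the paper's code) of the transfer recipe described above: a searched "cell" is treated as a reusable module, and a deeper network for a larger dataset is built by stacking more copies of it, each with its own parameters. The body of `Cell` here is a hypothetical stand-in for whatever structure the search actually finds.

```python
import torch
import torch.nn as nn

class Cell(nn.Module):
    """Stand-in for a searched convolutional cell (hypothetical structure)."""
    def __init__(self, channels):
        super().__init__()
        self.op = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
        )

    def forward(self, x):
        return x + self.op(x)  # residual connection around the cell

def stack_cells(channels, num_cells):
    # Each copy is a fresh Cell instance, so each has its own parameters.
    return nn.Sequential(*[Cell(channels) for _ in range(num_cells)])

small_model = stack_cells(channels=32, num_cells=6)   # e.g. CIFAR-10 scale
large_model = stack_cells(channels=32, num_cells=18)  # e.g. ImageNet scale
```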
Recent works have highlighted the strength of the Transformer architecture on sequence tasks while, at the same time, neural architecture search (NAS) has begun to outperform human-designed models.
#2 best model for Machine Translation on WMT2014 English-German
As the field of data science continues to grow, there will be an ever-increasing demand for tools that make machine learning accessible to non-experts.
In this paper, we propose a novel framework that enables Bayesian optimization to guide network morphism for efficient neural architecture search.
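A toy sketch of how such a search loop might look: candidate architectures are produced by simple morphism operations (deepen, widen), a kernel-weighted surrogate predicts their accuracy from the architectures evaluated so far, and an acquisition score adds an exploration bonus. The kernel, surrogate, and morphism operations below are illustrative assumptions, not the paper's actual system.

```python
import math
import random

def morphs(arch):
    """Hypothetical morphism operations: deepen or widen the network."""
    depth, width = arch
    return [(depth + 1, width), (depth, width * 2)]

def similarity(a, b):
    """Toy kernel: architectures that differ less are more similar."""
    return math.exp(-(abs(a[0] - b[0]) + abs(math.log2(a[1]) - math.log2(b[1]))))

def surrogate(arch, history):
    """Kernel-weighted mean of observed accuracies (stand-in for a GP mean)."""
    weights = [(similarity(arch, a), acc) for a, acc in history]
    total = sum(w for w, _ in weights)
    return sum(w * acc for w, acc in weights) / total

def acquisition(arch, history, beta=0.1):
    """Predicted accuracy plus an exploration bonus for unfamiliar archs."""
    novelty = 1.0 - max(similarity(arch, a) for a, _ in history)
    return surrogate(arch, history) + beta * novelty

def train_and_eval(arch):
    """Placeholder for actually training the network; returns accuracy."""
    depth, width = arch
    return random.uniform(0.5, 0.6) + 0.01 * depth + 0.005 * math.log2(width)

history = [((2, 16), train_and_eval((2, 16)))]
for _ in range(5):
    best_parent = max(history, key=lambda h: h[1])[0]
    candidate = max(morphs(best_parent), key=lambda a: acquisition(a, history))
    history.append((candidate, train_and_eval(candidate)))
```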
AdaNet is a lightweight framework built on TensorFlow (Abadi et al., 2015) for automatically learning high-quality ensembles with minimal expert intervention.
This paper addresses the scalability challenge of architecture search by formulating the task in a differentiable manner.
#15 best model for Language Modelling on Penn Treebank (Word Level)
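A minimal sketch of what a differentiable formulation can look like (in the spirit of this line of work, not the paper's code): each edge computes a softmax-weighted mixture of candidate operations, so the architecture weights receive gradients just like the ordinary model weights. The three candidate operations are arbitrary choices for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # candidate op 1
            nn.Conv2d(channels, channels, 5, padding=2),  # candidate op 2
            nn.Identity(),                                # candidate op 3 (skip)
        ])
        # One learnable architecture parameter per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

op = MixedOp(channels=16)
x = torch.randn(1, 16, 8, 8)
loss = op(x).pow(2).mean()
loss.backward()  # gradients flow into alpha as well as the conv weights
```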
Model compression is a critical technique for efficiently deploying neural network models on mobile devices, which have limited computation resources and tight power budgets.
Convolutional Neural Networks (ConvNets) are commonly developed at a fixed resource budget, and then scaled up for better accuracy if more resources are available.
SOTA for Image Classification on CIFAR-100 (using extra training data)
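One concrete recipe for such scaling is compound scaling, where a single coefficient phi grows depth, width, and resolution together. The multipliers below (alpha=1.2, beta=1.1, gamma=1.15) are the ones reported for EfficientNet; the base depth and width are made-up illustrative values, so treat this as a back-of-the-envelope sketch rather than the official scaling code.

```python
alpha, beta, gamma = 1.2, 1.1, 1.15  # depth, width, resolution multipliers

def scale(phi, base_depth=18, base_width=32, base_resolution=224):
    # Scale every dimension by its multiplier raised to the compound
    # coefficient phi, so all three grow in a fixed ratio.
    return {
        "depth": round(base_depth * alpha ** phi),
        "width": round(base_width * beta ** phi),
        "resolution": round(base_resolution * gamma ** phi),
    }

for phi in range(4):
    print(phi, scale(phi))
```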