Search Results for author: Fabian Chudak

Found 5 papers, 2 papers with code

A General-Purpose Transferable Predictor for Neural Architecture Search

no code implementations • 21 Feb 2023 • Fred X. Han, Keith G. Mills, Fabian Chudak, Parsa Riahi, Mohammad Salameh, Jialin Zhang, Wei Lu, Shangling Jui, Di Niu

In this paper, we propose a general-purpose neural predictor for NAS that can transfer across search spaces, by representing any given candidate Convolutional Neural Network (CNN) with a Computation Graph (CG) that consists of primitive operators.

Tasks: Contrastive Learning, Graph Representation Learning (+1 more)
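
To make the computation-graph (CG) idea above concrete, here is a minimal illustrative sketch, not taken from the paper's code, of encoding a tiny CNN as a directed graph whose nodes are primitive operators; the use of networkx and the node attributes shown are assumptions.

```python
# Illustrative only: a tiny CNN expressed as a computation graph (CG)
# whose nodes are primitive operators, roughly in the spirit of the
# predictor input described above. Attribute choices are hypothetical.
import networkx as nx

cg = nx.DiGraph()
cg.add_node("input",  op="input",      channels=3)
cg.add_node("conv1",  op="conv3x3",    channels=32)
cg.add_node("bn1",    op="batchnorm",  channels=32)
cg.add_node("relu1",  op="relu",       channels=32)
cg.add_node("pool1",  op="maxpool2x2", channels=32)
cg.add_node("fc",     op="linear",     channels=10)

# Edges follow the forward data flow of the network.
cg.add_edges_from([
    ("input", "conv1"), ("conv1", "bn1"), ("bn1", "relu1"),
    ("relu1", "pool1"), ("pool1", "fc"),
])

# A transferable predictor would consume a graph like this (e.g., via a
# graph neural network over the node features), independent of search space.
print(cg.number_of_nodes(), cg.number_of_edges())  # 6 5
```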

GENNAPE: Towards Generalized Neural Architecture Performance Estimators

1 code implementation • 30 Nov 2022 • Keith G. Mills, Fred X. Han, Jialin Zhang, Fabian Chudak, Ali Safari Mamaghani, Mohammad Salameh, Wei Lu, Shangling Jui, Di Niu

In this paper, we propose GENNAPE, a Generalized Neural Architecture Performance Estimator, which is pretrained on open neural architecture benchmarks, and aims to generalize to completely unseen architectures through combined innovations in network representation, contrastive pretraining, and fuzzy clustering-based predictor ensemble.

Tasks: Contrastive Learning, Image Classification (+1 more)
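
As a rough illustration of the fuzzy clustering-based predictor ensemble mentioned above (a sketch under assumptions, not GENNAPE's implementation): soft cluster memberships of an architecture embedding can weight the outputs of per-cluster predictors. All names below are hypothetical.

```python
# Hedged sketch of a fuzzy clustering-weighted predictor ensemble.
# Not GENNAPE's code; uses fuzzy c-means style memberships over embeddings.
import numpy as np

def soft_memberships(z, centroids, m=2.0):
    """Soft membership of embedding z to each cluster centroid."""
    d = np.linalg.norm(centroids - z, axis=1) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

def ensemble_predict(z, centroids, cluster_predictors):
    """Membership-weighted average of the per-cluster predictor outputs."""
    u = soft_memberships(z, centroids)
    preds = np.array([p(z) for p in cluster_predictors])
    return float(u @ preds)

# Toy usage: 3 clusters in an 8-dim embedding space, constant predictors.
rng = np.random.default_rng(0)
centroids = rng.normal(size=(3, 8))
predictors = [lambda z, c=c: 0.70 + 0.05 * c for c in range(3)]
print(ensemble_predict(rng.normal(size=8), centroids, predictors))
```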

Profiling Neural Blocks and Design Spaces for Mobile Neural Architecture Search

1 code implementation • 25 Sep 2021 • Keith G. Mills, Fred X. Han, Jialin Zhang, Seyed Saeed Changiz Rezaei, Fabian Chudak, Wei Lu, Shuo Lian, Shangling Jui, Di Niu

Neural architecture search automates neural network design and has achieved state-of-the-art results in many deep learning applications.

Tasks: Neural Architecture Search

Benchmarking Quantum Hardware for Training of Fully Visible Boltzmann Machines

no code implementations • 14 Nov 2016 • Dmytro Korenkevych, Yanbo Xue, Zhengbing Bian, Fabian Chudak, William G. Macready, Jason Rolfe, Evgeny Andriyash

We argue that this relates to the fact that we are training a quantum rather than classical Boltzmann distribution in this case.

Tasks: Benchmarking
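
For context on the quantum-versus-classical distinction in the snippet above, the standard textbook forms of the two distributions are sketched below; the notation is generic background, not taken from the paper.

```latex
% Classical fully visible Boltzmann machine over spins s_i \in \{-1,+1\}:
\[
  E(\mathbf{s}) = -\sum_{i<j} J_{ij} s_i s_j - \sum_i h_i s_i,
  \qquad
  p(\mathbf{s}) = \frac{e^{-E(\mathbf{s})}}{\sum_{\mathbf{s}'} e^{-E(\mathbf{s}')}}.
\]
% Quantum Boltzmann distribution for a transverse-field Ising Hamiltonian:
\[
  H = -\sum_{i<j} J_{ij}\,\sigma^z_i \sigma^z_j
      - \sum_i h_i\,\sigma^z_i
      - \Gamma \sum_i \sigma^x_i,
  \qquad
  \rho = \frac{e^{-H}}{\operatorname{Tr} e^{-H}},
\]
% Hardware samples follow the diagonal of \rho, which differs from the
% classical p(\mathbf{s}) whenever \Gamma \neq 0.
```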
