Noisy Heuristics NAS: A Network Morphism based Neural Architecture Search using Heuristics

Suman Sapkota, Binod Bhattarai

Research output: Contribution to conference › Unpublished paper › peer-review

Abstract

Network Morphism based Neural Architecture Search (NAS) is one of the most efficient methods; however, deciding where and when to add new neurons or remove dysfunctional ones is generally left to black-box Reinforcement Learning models. In this paper, we present a new Network Morphism based NAS called Noisy Heuristics NAS, which uses heuristics learned from manually developing neural network models and inspired by biological neuronal dynamics. Firstly, we add new neurons randomly and prune some away to keep only the best-fitting neurons. Secondly, we control the number of layers in the network using the relationship of hidden units to the number of input-output connections. Our method can increase or decrease the capacity or non-linearity of models online, governed by a few meta-parameters specified by the user. Our method generalizes both on toy datasets and on real-world datasets such as MNIST, CIFAR-10, and CIFAR-100. Its performance is comparable to the hand-engineered architecture ResNet-18 with a similar number of parameters.
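The grow-and-prune heuristic described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a single fully connected layer represented as a NumPy weight matrix, uses random Gaussian initialization for new units, and uses L2 weight norm as a stand-in fitness criterion for pruning (the paper's actual fitness heuristic may differ).

```python
import numpy as np

rng = np.random.default_rng(0)

def grow_neurons(weights, n_new, scale=0.01):
    """Add n_new randomly initialized hidden units (rows) to a layer's weight matrix."""
    n_in = weights.shape[1]
    new_rows = rng.normal(0.0, scale, size=(n_new, n_in))
    return np.vstack([weights, new_rows])

def prune_neurons(weights, keep):
    """Keep the `keep` hidden units with the largest L2 weight norm (a simple fitness proxy)."""
    norms = np.linalg.norm(weights, axis=1)
    idx = np.argsort(norms)[-keep:]          # indices of the best-fitting units
    return weights[np.sort(idx)]             # preserve original unit order

# Start with 8 hidden units over 4 inputs, grow to 12, then prune back to 10.
w = rng.normal(size=(8, 4))
w = grow_neurons(w, n_new=4)    # shape (12, 4)
w = prune_neurons(w, keep=10)   # shape (10, 4)
```

Repeating this grow-then-prune cycle during training lets the layer width track the task's demands online, with the growth rate and pruning fraction exposed as user-set meta-parameters.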
Original language: English
Number of pages: 11
DOIs
Publication status: Published - 22 Jul 2022
Externally published: Yes
Event: Workshop on Dynamic Neural Networks, 2022 International Conference on Machine Learning - Baltimore, United States
Duration: 22 Jul 2022 – 22 Jul 2022
https://dynn-icml2022.github.io/index

Conference

Conference: Workshop on Dynamic Neural Networks
Abbreviated title: DYNN-ICML 2022
Country/Territory: United States
City: Baltimore
Period: 22/07/22 – 22/07/22
