Publication
IJCNN 2024
Conference paper

DfuseNAS: A Diffusion-Based Neural Architecture Search

Abstract

Deep Learning (DL) has revolutionized numerous domains through highly effective Neural Network (NN) architectures. However, manual engineering approaches to NN design often yield sub-optimal solutions. Neural Architecture Search (NAS) addresses this issue by automating the design process and discovering state-of-the-art architectures. Nevertheless, exploring the vast search spaces involved in NAS remains challenging. Diffusion models have proven effective at traversing the expansive search spaces encountered in generative image tasks, and their innate ability to compress and explore these spaces has shown considerable promise. Building on this inspiration, we introduce DfuseNAS, a novel NAS methodology rooted in diffusion processes. DfuseNAS brings substantial improvements in both NAS search efficiency and the quality of the generated neural network architectures. To the best of our knowledge, our work marks a pioneering effort in applying diffusion algorithms to enhance search space exploration in NAS. Our experiments, conducted on the widely used NAS-Bench-101 benchmark, showcase the capabilities of DfuseNAS: it achieves the highest average accuracy, outperforming other state-of-the-art methods, while completing the search at least twice as fast. Moreover, when provided with a specific architecture and task, DfuseNAS generated a more accurate architecture in 98% of cases.
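For intuition only, the sketch below shows how a generic diffusion-style reverse (denoising) process could sample a NAS-Bench-101-like cell encoding (a 7x7 adjacency matrix plus per-node operation choices). It is not the DfuseNAS implementation; the noise schedule, step count, placeholder denoiser, and all names here are assumptions.

    # Illustrative sketch: DDPM-style reverse loop over a flattened architecture
    # encoding. The denoiser below is a placeholder; a trained noise-prediction
    # model would be used in practice. Not taken from the DfuseNAS paper.
    import numpy as np

    NUM_STEPS = 50                              # assumed number of diffusion steps
    ENC_DIM = 7 * 7 + 7 * 3                     # 7x7 adjacency + 7 nodes x 3 op choices

    betas = np.linspace(1e-4, 0.02, NUM_STEPS)  # standard linear noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)

    def denoise_fn(x_t, t):
        """Placeholder noise predictor (would be a learned model in practice)."""
        return np.zeros_like(x_t)

    def sample_architecture(rng):
        # Start from pure Gaussian noise and iteratively denoise the encoding.
        x = rng.standard_normal(ENC_DIM)
        for t in reversed(range(NUM_STEPS)):
            eps_hat = denoise_fn(x, t)
            coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
            mean = (x - coef * eps_hat) / np.sqrt(alphas[t])
            noise = rng.standard_normal(ENC_DIM) if t > 0 else 0.0
            x = mean + np.sqrt(betas[t]) * noise
        # Threshold the continuous encoding back to a discrete graph + op choices.
        adjacency = (x[:49].reshape(7, 7) > 0.5).astype(int)
        ops = x[49:].reshape(7, 3).argmax(axis=1)
        return adjacency, ops

    adjacency, ops = sample_architecture(np.random.default_rng(0))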
