Abstract: Differentiable Neural Architecture Search (DARTS) has recently attracted considerable research attention because of its high efficiency. However, competition among candidate operations in DARTS introduces high uncertainty into the selection of the truly important operations, leading to serious performance collapse. In this work, we decrease the uncertainty of differentiable architecture search (DU-DARTS) by enforcing the distribution of architecture parameters to approach a one-hot categorical distribution and by replacing the zero operation with a gate switch. Without any extra search cost, our method achieves state-of-the-art performance with 2.32%, 16.74%, and 24.1% test error on the CIFAR-10, CIFAR-100, and ImageNet datasets, respectively. Moreover, DU-DARTS robustly finds an excellent architecture on NAS-Bench-1Shot1, which further demonstrates the effectiveness of our method.
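The idea of pushing architecture parameters toward a one-hot categorical distribution can be illustrated with a minimal sketch: an entropy penalty on the softmax of the architecture parameters is small when the distribution is nearly one-hot and large when it is uniform, so adding it to the search loss discourages ambiguous operation mixtures. The parameter values and the four-operation edge below are hypothetical, and this is only one common way to realize such a regularizer, not necessarily the paper's exact formulation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of raw architecture parameters."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(ps):
    """Shannon entropy of a probability vector; 0 for a one-hot vector."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Hypothetical architecture parameters for one edge with 4 candidate operations.
alphas = [0.1, 0.2, 0.3, 2.0]
probs = softmax(alphas)

# Adding this penalty to the search loss drives the distribution toward
# one-hot, reducing the uncertainty of operation selection.
penalty = entropy(probs)

# A uniform distribution (maximal uncertainty) incurs the largest penalty.
uniform_penalty = entropy(softmax([0.0, 0.0, 0.0, 0.0]))
```

Minimizing this penalty alongside the usual validation loss makes the final discretization step (picking the argmax operation per edge) a smaller perturbation of the searched architecture.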