On the evaluation side, traditional NAS algorithms rely on full training of every candidate, which wastes compute; adding early stopping gives roughly the effect of partial training. NAS algorithms circa 2019 generally share parameters in some way to speed up training and cut the compute requirement, e.g. weight sharing (ENAS), network morphism (Auto-Keras), or a shared supergraph / one-shot supernet (DARTS, ENAS).
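A minimal sketch of the evaluation-cost point: instead of fully training every candidate, train only until validation accuracy stops improving. Everything here (`sample_architecture`, `train_one_epoch_and_eval`, the simulated accuracy curve) is hypothetical scaffolding for illustration, not the protocol of any particular paper.

```python
import random

# Hypothetical stand-ins: a real NAS pipeline would build and train an actual
# model; here a noisy, saturating validation-accuracy curve is simulated.
def sample_architecture(rng, n_layers=8):
    return [rng.choice(["conv3x3", "conv5x5", "maxpool", "identity"])
            for _ in range(n_layers)]

def train_one_epoch_and_eval(arch, state, rng):
    # Simulated validation accuracy: approaches an architecture-dependent ceiling.
    ceiling = 0.70 + 0.03 * arch.count("conv3x3")
    state["epochs"] += 1
    return ceiling * (1 - 0.5 ** state["epochs"]) + rng.gauss(0, 0.01)

def evaluate_with_early_stopping(arch, rng, max_epochs=20, patience=3):
    """Partial training: stop once validation accuracy stops improving."""
    state = {"epochs": 0}
    best, stale = 0.0, 0
    for _ in range(max_epochs):
        acc = train_one_epoch_and_eval(arch, state, rng)
        if acc > best:
            best, stale = acc, 0
        else:
            stale += 1
            if stale >= patience:
                break  # early stop: much cheaper than full training
    return best, state["epochs"]

if __name__ == "__main__":
    rng = random.Random(42)
    arch = sample_architecture(rng)
    acc, epochs = evaluate_with_early_stopping(arch, rng)
    print(f"arch={arch}")
    print(f"proxy accuracy={acc:.3f} after {epochs} epochs (budget 20)")
```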
NAS has become quite complex: the supergraph, for instance, introduces extra complexity into training, and network morphisms require architecture transformations that satisfy certain criteria. Weight sharing, by contrast, is simple to use and not hard to implement, so the authors ultimately built their baseline from random search combined with early stopping and weight sharing (see the sketch below).
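A minimal sketch of random search with weight sharing, in the spirit of that baseline: a toy supernet whose layer weights are reused by every sampled sub-network is trained by drawing random architectures, and the shared weights are then used to rank random candidates without any per-candidate training. The `SuperNet`/`MixedLayer` classes, the operation set, and the synthetic data are assumptions made for illustration; the paper's actual search space and evaluation protocol are much larger.

```python
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

OPS = ["linear", "linear_relu", "identity"]

class MixedLayer(nn.Module):
    """One layer of the supernet; its weights are shared by all sub-networks."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)  # reused by "linear" and "linear_relu"

    def forward(self, x, op):
        if op == "identity":
            return x
        out = self.linear(x)
        return F.relu(out) if op == "linear_relu" else out

class SuperNet(nn.Module):
    def __init__(self, dim=16, n_layers=4, n_classes=2):
        super().__init__()
        self.layers = nn.ModuleList([MixedLayer(dim) for _ in range(n_layers)])
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x, arch):
        for layer, op in zip(self.layers, arch):
            x = layer(x, op)
        return self.head(x)

def random_arch(rng, n_layers=4):
    return [rng.choice(OPS) for _ in range(n_layers)]

def main():
    rng = random.Random(0)
    torch.manual_seed(0)
    net = SuperNet()
    opt = torch.optim.SGD(net.parameters(), lr=0.1)
    # Synthetic classification data, just to make the loop runnable.
    x = torch.randn(256, 16)
    y = (x[:, 0] > 0).long()

    # Phase 1: train the shared weights, sampling a random sub-network each step.
    for _ in range(200):
        arch = random_arch(rng)
        loss = F.cross_entropy(net(x, arch), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Phase 2: random search, ranking candidates with the shared weights
    # (no further training per candidate).
    best_arch, best_acc = None, -1.0
    with torch.no_grad():
        for _ in range(20):
            arch = random_arch(rng)
            acc = (net(x, arch).argmax(dim=1) == y).float().mean().item()
            if acc > best_acc:
                best_arch, best_acc = arch, acc
    print(f"best arch={best_arch}  shared-weight val acc={best_acc:.3f}")

if __name__ == "__main__":
    main()
```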
https://arxiv.org/abs/1902.07638