Neural Architecture Search
1st lightweight NAS challenge and moving beyond

Accepted papers:

■    Improving Ranking Correlation of Supernet with Candidates Enhancement and Progressive Training: 
      Ziwei Yang, Ruyi Zhang, Zhi Yang, Xubo Yang, Lei Wang and Zheyang Li [PDF]
■    One-Shot Neural Channel Search: What Works and What’s Next: Chaoyu Guan, Yijian Qin, Zhikun Wei, Zeyang Zhang, 
      Zizhao Zhang, Xin Wang and Wenwu Zhu [PDF]
■    Semi-Supervised Accuracy Predictor: SemiLGB: Hai Li, Yang Li and Zhengrong Zhuo [PDF]
■    Cascade Bagging for Accuracy Prediction with Few Training Samples: Ruyi Zhang, Ziwei Yang, Zhi Yang, Xubo Yang, 
      Lei Wang and Zheyang Li [PDF]
■    A Platform-based Framework for the NAS Performance Prediction Challenge: Haocheng Wang, Yuxin Shen, Zifeng Yu, 
      Guoming Sun, Xiaoxing Chen and Chenhan Tsai [PDF]
■    AutoAdapt: Automated Segmentation Network Search for Unsupervised Domain Adaptation: Xueqing Deng, Yuxin Tian, 
      Shawn Newsam and Yi Zhu [PDF]
■    NAS-Bench-x11 and the Power of Learning Curves: Shen Yan, Colin White, Yash Savani and Frank Hutter [PDF]
■    Bag of Tricks for Neural Architecture Search: Thomas Elsken, Benedikt Staffler, Arber Zela, Jan Hendrik Metzen 
      and Frank Hutter [PDF]
■    Group Sparsity: A Unified Framework for Network Pruning and Neural Architecture Search: 
      Avraam Chatzimichailidis, Arber Zela, Shalini Shalini, Peter Labus, Janis Keuper, Frank Hutter and Yang Yang [PDF]


Paper submission:

We invite submissions of two types: workshop proceedings and extended abstracts. Submissions may address any aspect of 
NAS and, more broadly, representation learning in vision and beyond. Topics include, but are not limited to:
     • Theoretical frameworks and novel objective functions for representation learning
     • Novel network architectures and training protocols
     • Adaptive multi-task and transfer learning
     • Multi-objective optimization and parameter estimation methods
     • Reproducibility in neural architecture search
     • Resource constrained architecture search
     • Automatic data augmentation and hyperparameter optimization
     • Unsupervised learning, domain transfer and life-long learning
     • Computer vision datasets and benchmarks for neural architecture search
     • Search algorithms and evaluation strategies for neural architecture search
     • Consistency issues in one-shot NAS
     • Probabilistic neural architecture search
     • Search space design of neural architecture search

Important Dates
For workshop proceedings (4-8 pages excluding references), 
    • Paper Submission Deadline: April 15, 2021 (11:59 p.m. PST)
    • Notification to Authors: April 17, 2021 (11:59 p.m. PST)
    • Camera-ready Paper Deadline: April 20, 2021 (11:59 p.m. PST)
    • Submission Guidelines: Submissions should follow the same policies as the main conference
For extended abstracts (4 pages including references),
    • Paper Submission Deadline: May 25, 2021 (11:59 p.m. PST)
    • Notification to Authors: June 1, 2021 (11:59 p.m. PST)
    • Camera-ready Paper Deadline: June 6, 2021 (11:59 p.m. PST)
    • Submission Guidelines: We solicit short papers of up to 4 pages (including references); accepted papers will be 
      linked on the workshop webpage. Submissions may be shortened versions of work presented at the main conference or 
      work in progress on topics relevant to the workshop. Each accepted paper will be allocated either a contributed 
      talk or a poster presentation, and one paper, recommended by the workshop program chairs during the peer-review 
      period, will receive the best paper award.

Manuscripts should follow the CVPR 2021 paper template and should be submitted through the CMT link below.  
    • Paper submission Link:  https://cmt3.research.microsoft.com/NAS2021/
    • Review process: Single-blind (i.e., submissions need not be anonymized)
    • Supplementary Materials: Authors can optionally submit supplemental materials for the paper via CMT. 
