Workshop: Hardware Aware Efficient Training 2021
In this workshop, we propose to focus on reducing the complexity of the training process. Our aim is to gather researchers interested in reducing the energy, time, or memory usage of training, enabling faster, cheaper, and greener prototyping and deployment of deep learning models. Because deep learning depends on large computational capacities, the outcomes of the workshop could benefit everyone who deploys these solutions, including those who are not hardware specialists. Moreover, it would help make deep learning more accessible to small businesses and small laboratories.
Website: https://haet2021.github.io/
Organizer from ImViA: Fan Yang
- Team: CORES
- Tags: workshop