Wed 25 Aug 2021 02:25 - 02:50 - Doctoral Symposium: Slot 2
Deep neural networks (DNNs) have become the leading technology for realizing Artificial Intelligence (AI). As DNN models become larger and more complex, so do datasets, and the ability to train DNNs efficiently in parallel has become a crucial need. Data Parallelism (DP) is the most widely used approach to accelerating DNN training today, but it can be inefficient when processing DNNs with large parameter sizes. Hybrid Parallelism (HP), which applies different parallel strategies to different parts of a DNN, is more efficient but requires advanced configuration. Not all AI researchers are experts in parallel computing, so automating the configuration of HP strategies is highly desirable for all AI frameworks. We propose a parallel semantics analysis method that analyzes the trade-offs among different kinds of parallelism and systematically chooses HP strategies with good training-time performance. We experimentally demonstrate a 260% speedup when applying our method compared to a conventional DP approach. With our proposal, AI researchers would be able to focus on AI algorithm research without being distracted by parallel analysis and engineering concerns.
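To make the DP-versus-HP distinction in the abstract concrete, the following PyTorch sketch contrasts a plain data-parallel replica with a simple hybrid configuration that additionally splits the parameter-heavy layers across two GPUs. This is an illustrative assumption, not the paper's analysis method or tooling; the layer sizes, device placements, and helper names (HybridParallelMLP, build_models) are hypothetical, the device ids are fixed for brevity, and running it requires at least two GPUs plus a torch.distributed launcher such as torchrun.

```python
# Illustrative sketch only (not the paper's method): contrasts plain Data
# Parallelism (DP) with a simple Hybrid Parallelism (HP) configuration in
# PyTorch. Layer sizes, device ids, and helper names are hypothetical.
# Requires >= 2 GPUs and a torch.distributed launcher such as torchrun.
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


class HybridParallelMLP(nn.Module):
    """Toy model whose parameter-heavy layers are split across two GPUs
    (model parallelism); DDP then replicates the whole module per process
    (data parallelism), yielding a hybrid strategy."""

    def __init__(self):
        super().__init__()
        # Each large layer lives on its own device, so its parameters are not
        # all duplicated on a single GPU as they would be under pure DP.
        self.fc1 = nn.Linear(4096, 4096).to("cuda:0")
        self.fc2 = nn.Linear(4096, 1024).to("cuda:1")

    def forward(self, x):
        x = torch.relu(self.fc1(x.to("cuda:0")))
        # Activations are transferred between the two model-parallel stages.
        return self.fc2(x.to("cuda:1"))


def build_models():
    # Pure DP baseline: the full model is replicated on one GPU per process,
    # and gradients are all-reduced across replicas.
    dp_model = DDP(
        nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 1024)).to("cuda:0"),
        device_ids=[0],
    )
    # Hybrid DP + model parallelism: device_ids is omitted because the module
    # already spans multiple devices; DDP still synchronizes gradients.
    hp_model = DDP(HybridParallelMLP())
    return dp_model, hp_model


if __name__ == "__main__":
    # torchrun sets RANK / WORLD_SIZE / MASTER_ADDR in the environment.
    dist.init_process_group(backend="nccl")
    dp_model, hp_model = build_models()
    out = hp_model(torch.randn(32, 4096))  # output lands on cuda:1
    print(out.shape)
```

In this toy setup the hybrid replica halves the per-GPU parameter footprint of the large layers at the cost of an inter-device activation transfer, which is the kind of trade-off the proposed analysis is meant to weigh automatically rather than leaving it to manual configuration.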
Tue 24 Aug (displayed time zone: Athens)

14:00 - 15:40
14:00 (25m, Paper) | Unveiling Multiple Facets of Design Degradation in Modern Code Review | Doctoral Symposium | Anderson Uchôa (PUC-Rio) | DOI
14:25 (25m, Paper) | Freeing Hybrid Distributed AI Training Configuration | Doctoral Symposium | Haoran Wang (Huawei; University of Orléans) | DOI
14:50 (25m, Paper) | Towards an Approach for Resource-Driven Adaptation | Doctoral Symposium | Paul A. Akiki (Open University) | DOI
15:15 (25m, Paper) | Deployment Coordination for Cross-Functional DevOps Teams | Doctoral Symposium | Daniel Sokolowski (TU Darmstadt) | DOI, Pre-print
Wed 25 Aug (displayed time zone: Athens)

02:00 - 03:40
02:00 (25m, Paper) | Unveiling Multiple Facets of Design Degradation in Modern Code Review | Doctoral Symposium | Anderson Uchôa (PUC-Rio) | DOI
02:25 (25m, Paper) | Freeing Hybrid Distributed AI Training Configuration | Doctoral Symposium | Haoran Wang (Huawei; University of Orléans) | DOI
02:50 (25m, Paper) | Towards an Approach for Resource-Driven Adaptation | Doctoral Symposium | Paul A. Akiki (Open University) | DOI
03:15 (25m, Paper) | Deployment Coordination for Cross-Functional DevOps Teams | Doctoral Symposium | Daniel Sokolowski (TU Darmstadt) | DOI, Pre-print