ESEC/FSE 2021
Thu 19 - Sat 28 August 2021 · Clowdr Platform
Tue 24 Aug 2021 14:25 - 14:50 - Doctoral Symposium: Slot 2
Wed 25 Aug 2021 02:25 - 02:50 - Doctoral Symposium: Slot 2

Deep neural networks (DNNs) have become the leading technology for realizing Artificial Intelligence (AI). As DNN models grow larger and more complex, so do the datasets used to train them, and the ability to train DNNs efficiently in parallel has become a crucial need. Data Parallelism (DP) is the most widely used approach to accelerating DNN training today, but it can be inefficient for DNNs with large parameter sets. Hybrid Parallelism (HP), which applies different parallelization strategies to different parts of a DNN, is more efficient but requires advanced configuration. Not all AI researchers are experts in parallel computing, so automating the configuration of HP strategies is highly desirable for all AI frameworks. We propose a parallel semantics analysis method that analyzes the trade-offs among different kinds of parallelism and systematically chooses HP strategies with good training-time performance. We experimentally demonstrated a 260% speedup when applying our method compared to a conventional DP approach. With our proposal, AI researchers can focus on AI algorithm research without being distracted by parallel analysis and engineering concerns.
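To make the DP-versus-HP trade-off concrete, the sketch below compares, per layer, the communication cost of synchronizing gradients under data parallelism against the cost of exchanging activations when that layer is instead sharded across workers, and picks the cheaper strategy. This is only an illustrative sketch of the underlying trade-off, not the method proposed in the paper: the Layer description, the cost formulas, and the worker and bandwidth figures are hypothetical assumptions chosen for the example.

# Minimal sketch of the per-layer trade-off between data and model parallelism.
# NOT the paper's algorithm; all sizes and constants below are hypothetical.

from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    param_bytes: int       # size of the layer's parameters
    activation_bytes: int  # size of activations crossing the layer boundary

def dp_comm_cost(layer: Layer, workers: int, bandwidth: float) -> float:
    # Data parallelism synchronizes gradients with an all-reduce over the
    # parameters each step (a ring all-reduce moves ~2*(w-1)/w of the data).
    return 2 * (workers - 1) / workers * layer.param_bytes / bandwidth

def mp_comm_cost(layer: Layer, workers: int, bandwidth: float) -> float:
    # Sharding the layer avoids gradient sync but must exchange activations
    # (forward) and activation gradients (backward) at the layer boundary.
    return 2 * layer.activation_bytes / bandwidth

def choose_strategy(layers, workers=8, bandwidth=10e9):
    # Pick, per layer, whichever strategy incurs less communication time.
    plan = {}
    for layer in layers:
        dp = dp_comm_cost(layer, workers, bandwidth)
        mp = mp_comm_cost(layer, workers, bandwidth)
        plan[layer.name] = "data-parallel" if dp <= mp else "model-parallel"
    return plan

if __name__ == "__main__":
    # Hypothetical layers: a convolution (few parameters, large activations)
    # and a fully connected layer (many parameters, small activations).
    layers = [
        Layer("conv1", param_bytes=2_000_000, activation_bytes=400_000_000),
        Layer("fc1", param_bytes=800_000_000, activation_bytes=4_000_000),
    ]
    print(choose_strategy(layers))  # conv1 -> data-parallel, fc1 -> model-parallel

Under these assumed numbers the convolution stays data-parallel while the fully connected layer is better off sharded, which is exactly the kind of mixed, per-part configuration that HP denotes and that the proposed analysis automates.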

Tue 24 Aug

Displayed time zone: Athens

14:00 - 15:40  Doctoral Symposium: Slot 2 (Doctoral Symposium, +12h mirror)

14:00  25m  Paper: Unveiling Multiple Facets of Design Degradation in Modern Code Review (Doctoral Symposium) · DOI
14:25  25m  Paper: Freeing Hybrid Distributed AI Training Configuration (Doctoral Symposium) · Haoran Wang (Huawei; University of Orléans) · DOI
14:50  25m  Paper: Towards an Approach for Resource-Driven Adaptation (Doctoral Symposium) · Paul A. Akiki (Open University) · DOI
15:15  25m  Paper: Deployment Coordination for Cross-Functional DevOps Teams (Doctoral Symposium) · Daniel Sokolowski (TU Darmstadt) · DOI · Pre-print

Wed 25 Aug

Displayed time zone: Athens

02:00 - 03:40  Doctoral Symposium: Slot 2 (Doctoral Symposium)

02:00  25m  Paper: Unveiling Multiple Facets of Design Degradation in Modern Code Review (Doctoral Symposium) · DOI
02:25  25m  Paper: Freeing Hybrid Distributed AI Training Configuration (Doctoral Symposium) · Haoran Wang (Huawei; University of Orléans) · DOI
02:50  25m  Paper: Towards an Approach for Resource-Driven Adaptation (Doctoral Symposium) · Paul A. Akiki (Open University) · DOI
03:15  25m  Paper: Deployment Coordination for Cross-Functional DevOps Teams (Doctoral Symposium) · Daniel Sokolowski (TU Darmstadt) · DOI · Pre-print