ESEC/FSE 2021
Thu 19 - Sat 28 August 2021, Clowdr Platform

In large-scale online service systems, software changes are inevitable and frequent. Because they introduce new code or configurations, changes are likely to incur incidents and degrade user experience. Thus it is essential for engineers to identify bad software changes, so as to reduce the impact of incidents and improve system reliability. To better understand bad software changes, we perform the first empirical study based on large-scale real-world data from a large commercial bank. Our quantitative analyses indicate that about 50.4% of incidents are caused by bad changes, mainly due to code defects, configuration errors, resource contention, and software version issues. Moreover, our qualitative analyses show that the current practice of detecting bad software changes does not handle well the heterogeneous multi-source data involved in software changes. Based on the findings and motivation obtained from the empirical study, we propose a novel approach named SCWarn, which aims to identify bad changes and produce interpretable alerts accurately and in a timely manner. The key idea of SCWarn is to draw on multimodal learning to identify anomalies in heterogeneous multi-source data. An extensive study on two datasets with various bad software changes demonstrates that our approach significantly outperforms all the compared approaches, achieving a 0.95 F1-score on average and reducing MTTD (mean time to detect) by 20.4%∼60.7%. In addition, we share some success stories and lessons learned from practical usage.
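
The abstract only names the technique; as a rough illustration of what multimodal anomaly detection over heterogeneous change data can look like, the sketch below encodes two sources (KPI metrics and log-template counts around a software change) with separate LSTMs, fuses their hidden states, and flags time steps whose one-step-ahead forecast error exceeds a threshold. This is a minimal sketch, not the authors' SCWarn implementation; the class name, feature layout, hidden size, and thresholding are illustrative assumptions.

# Minimal multimodal anomaly-detection sketch (illustrative, not SCWarn itself).
# Assumes metrics of shape (batch, T, n_metrics) and log-template counts of
# shape (batch, T, n_log_templates), sampled on the same time grid.
import torch
import torch.nn as nn

class MultimodalDetector(nn.Module):
    def __init__(self, n_metrics, n_log_templates, hidden=32):
        super().__init__()
        # One encoder per modality, so heterogeneous sources keep separate dynamics.
        self.metric_lstm = nn.LSTM(n_metrics, hidden, batch_first=True)
        self.log_lstm = nn.LSTM(n_log_templates, hidden, batch_first=True)
        # The fused representation predicts the next metric vector.
        self.head = nn.Linear(2 * hidden, n_metrics)

    def forward(self, metrics, logs):
        m_out, _ = self.metric_lstm(metrics)   # (batch, T, hidden)
        l_out, _ = self.log_lstm(logs)         # (batch, T, hidden)
        fused = torch.cat([m_out, l_out], dim=-1)
        return self.head(fused)                # one-step-ahead forecast

def detect(model, metrics, logs, threshold):
    """Flag time steps whose forecast error exceeds the threshold."""
    with torch.no_grad():
        pred = model(metrics[:, :-1], logs[:, :-1])
        err = (pred - metrics[:, 1:]).abs().mean(dim=-1)  # (batch, T-1)
    return err > threshold

One plausible way to use such a sketch: train it on change-free periods by minimizing the forecast error, then alert on the first flagged step after a deployment. Under that reading, MTTD is the delay between the change and the first alert, and the F1-score is computed over alerted versus labeled bad changes.
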

Wed 25 Aug

Displayed time zone: Athens

09:00 - 10:00
Analytics & Software Evolution—Code Reviews and Changes (Journal First / Research Papers / Demonstrations / Ideas, Visions and Reflections) +12h
Chair(s): Ingrid Nunes Universidade Federal do Rio Grande do Sul (UFRGS), Brazil, Anthony Cleve University of Namur
09:00
10m
Paper
Identifying Bad Software Changes via Multimodal Anomaly Detection for Online Service Systems
Research Papers
Nengwen Zhao Tsinghua University, Junjie Chen Tianjin University, Zhaoyang Yu Tsinghua University, Honglin Wang BizSeer, Jiesong Li China Guangfa Bank, Bin Qiu China Guangfa Bank, Hongyu Xu China Guangfa Bank, Wenchi Zhang BizSeer, Kaixin Sui BizSeer, Dan Pei Tsinghua University
09:10
10m
Paper
Journal First Submission of the Article: "An Empirical Investigation of Relevant Changes and Automation Needs in Modern Code Review"
Journal First
Sebastiano Panichella Zurich University of Applied Sciences, Nick Zaugg University of Zurich
09:20
5m
Paper
Exploit Those Code Reviews! Bigger Data for Deeper Learning
Demonstrations
Robert Heumüller University of Magdeburg, Sebastian Nielebock Otto-von-Guericke University Magdeburg, Frank Ortmeier University of Magdeburg
09:25
5m
Paper
Towards Automating Code Review at Scale
Ideas, Visions and Reflections
Vincent J. Hellendoorn Carnegie Mellon University, Jason Tsay IBM Research, Manisha Mukherjee Carnegie Mellon University, Martin Hirzel IBM Research
09:30
30m
Live Q&A
Q&A (Analytics & Software Evolution—Code Reviews and Changes)
Research Papers

21:00 - 22:00
Analytics & Software Evolution—Code Reviews and Changes (Research Papers / Demonstrations / Ideas, Visions and Reflections / Journal First)
Chair(s): Emad Aghajani Software Institute, USI Università della Svizzera italiana
21:00
10m
Paper
Identifying Bad Software Changes via Multimodal Anomaly Detection for Online Service Systems
Research Papers
Nengwen Zhao Tsinghua University, Junjie Chen Tianjin University, Zhaoyang Yu Tsinghua University, Honglin Wang BizSeer, Jiesong Li China Guangfa Bank, Bin Qiu China Guangfa Bank, Hongyu Xu China Guangfa Bank, Wenchi Zhang BizSeer, Kaixin Sui BizSeer, Dan Pei Tsinghua University
21:10
10m
Paper
Journal First Submission of the Article: "An Empirical Investigation of Relevant Changes and Automation Needs in Modern Code Review"
Journal First
Sebastiano Panichella Zurich University of Applied Sciences, Nick Zaugg University of Zurich
21:20
5m
Paper
Exploit Those Code Reviews! Bigger Data for Deeper Learning
Demonstrations
Robert Heumüller University of Magdeburg, Sebastian Nielebock Otto-von-Guericke University Magdeburg, Frank Ortmeier University of Magdeburg
21:25
5m
Paper
Towards Automating Code Review at Scale
Ideas, Visions and Reflections
Vincent J. Hellendoorn Carnegie Mellon University, Jason Tsay IBM Research, Manisha Mukherjee Carnegie Mellon University, Martin Hirzel IBM Research
21:30
30m
Live Q&A
Q&A (Analytics & Software Evolution—Code Reviews and Changes)
Research Papers