
Travel Statement

Authors and attendees are guaranteed that they can attend and present at ICPE 2022 virtually, without traveling to Beijing, in case the ongoing COVID-19 pandemic imposes travel restrictions or attendees are concerned about their personal health or CO2 emissions. However, we aim and hope for most attendees, and in particular paper authors, to attend the conference physically in Beijing.

Call for papers

Software systems (e.g., smartphone apps, desktop applications, telecommunication infrastructures, and enterprise systems) have strict requirements on software performance. Failing to meet these requirements may cause business losses, customer defection, brand damage, and other serious consequences. In addition to conventional functional testing, the performance of these systems must be verified through load testing or benchmarking to ensure quality service.

Load testing examines the behavior of a system by simulating hundreds or thousands of users performing tasks at the same time. Benchmarking compares the system's performance against other similar systems in the domain. The workshop is not limited to traditional load testing; it is open to any ideas for re-inventing and extending load testing, as well as any other way to ensure system performance and resilience under load, including any kind of performance testing, resilience/reliability/high-availability/stability testing, operational profile testing, stress testing, A/B and canary testing, volume testing, and chaos engineering.
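To make the idea concrete, the simulation of many concurrent users described above can be sketched in a few lines of Python. This is a minimal illustration, not a tool endorsed by the workshop: `perform_task` is a hypothetical stand-in for a real request to the system under test (e.g., an HTTP call), and the statistics reported are just one common way to summarize observed latencies.

```python
# Minimal sketch of a load test: simulate many concurrent "virtual users"
# each performing one task, then summarize the observed latencies.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def perform_task() -> float:
    """Hypothetical unit of user work; returns its latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # placeholder for a real request to the system under test
    return time.perf_counter() - start

def run_load_test(num_users: int = 100) -> dict:
    """Run `num_users` concurrent tasks and report latency statistics."""
    with ThreadPoolExecutor(max_workers=num_users) as pool:
        latencies = sorted(pool.map(lambda _: perform_task(), range(num_users)))
    return {
        "users": num_users,
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(0.95 * (len(latencies) - 1))],
    }

if __name__ == "__main__":
    print(run_load_test(50))
```

Real load-testing tools add workload models (think time, operational profiles, ramp-up), but the core loop is the same: drive concurrent work and measure how the system responds.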

Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and implementing the testing tools), environments (software and hardware setup), and time (limited time to design, test, and analyze). Yet little research has been done in the software engineering domain on this topic.

Adapting load testing to recent industry trends, such as cloud computing, agile/iterative development, continuous integration/delivery, microservices, serverless computing, AI/ML services, and containers, poses major challenges that are not yet fully addressed.

This one-day workshop brings together software testing and software performance researchers, practitioners, and tool developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking software systems. Our ultimate goal is to grow an active community around this important and practical research topic.

We solicit two tracks of submissions:

  1. Research or industry papers (maximum 4 pages)
  2. Presentation track for industry or research talks (extended abstract of at most 700 words)
Research/industry papers should follow the standard ACM SIG proceedings format and must be submitted electronically via EasyChair. Extended abstracts for the presentation track must likewise be submitted as "abstract only" submissions via EasyChair. Accepted papers will be published in the ICPE 2022 Proceedings. Submissions can be research papers, position papers, case studies, or experience reports addressing issues including but not limited to the following:

Important Dates

Paper Track (research and industry papers):

Abstract submission: January 2, 2022, AOE;
Paper submission: January 9, 2022, AOE;
Author notification: February 6, 2022, AOE;
Camera-ready version: TBD;

Presentation Track:

Extended abstract submission: January 26, 2022, AOE;
Author notification: February 6, 2022, AOE;



Organizers:

Alexander Podelko, MongoDB, USA
Heng Li, Polytechnique Montréal, Canada
Nima Mahmoudi, University of Alberta, Canada

Program Committee:

The Program Committee will be finalized soon.

Past LTB Workshops: