2017 International Workshop on Software Engineering for High Performance Computing in Computational and Data-Enabled Science and Engineering
November 12, 2017
Held in Conjunction with
SC17
Part of the Software Engineering for Science Workshop Series
Deadline Extended to August 20, AoE
NEWS: Travel Grants Available - contact Jeff Carver (carver@cs.ua.edu)
Overview
This workshop is concerned with identifying and applying appropriate software engineering (SE) tools and practices (e.g., code generators, static analyzers, verification and validation (V&V) practices, testing, design approaches, and maintenance practices) to support and ease the development of reproducible Computational and Data-enabled Science & Engineering (CoDeSE) software for High Performance Computing (HPC). Specifically:
- CoDeSE applications that include large parallel models/simulations of the physical world running on HPC systems.
- CoDeSE applications that utilize HPC systems (e.g., GPU computing, compute clusters, or supercomputers) to manage and/or manipulate large amounts of data.
- Requirements
- Risks due to the exploration of relatively unknown scientific/engineering phenomena;
- Supporting reproducible science, particularly on non-deterministic systems;
- Constant change as new information is gathered;
- Design
- Data dependencies within the software;
- The need to identify the most appropriate parallelization strategy for CoDeSE algorithms;
- The presence of complex communication among HPC nodes that could degrade performance;
- Challenges in designing unit and system tests at appropriate scales;
- The need for fault tolerance and task migration mechanisms to avoid restarting time-consuming computations after software or hardware errors (a minimal checkpoint/restart sketch follows this list);
- V&V
- Results are often unknown when exploring novel science or engineering areas, algorithms, and datasets;
- Challenges in applying unit and system tests at appropriate scales;
- Challenges in retrospectively designing and implementing tests for legacy code;
- Popular tools often do not work on the latest HPC architectures and must be tuned to handle many threads executing concurrently;
- Deployment
- Failure of components within running systems is expected due to system size;
- Continuous integration on platforms with high availability and infrequent downtimes;
- Long system lifespans necessitate porting across multiple platforms.
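To make the fault-tolerance item above concrete, here is a minimal checkpoint/restart sketch in C++. It is our illustration, not a workshop artifact: the file name sim.ckpt, the checkpoint interval, and the stand-in computation are all hypothetical, and production HPC codes would use checkpointing libraries and parallel file systems instead. A long-running loop periodically saves its state so an interrupted job resumes from the last checkpoint rather than recomputing from iteration zero.

    // Minimal checkpoint/restart sketch (hypothetical example).
    #include <cstdio>
    #include <fstream>

    struct State { long iter = 0; double value = 1.0; };

    // Try to resume from an earlier checkpoint; returns false on a fresh start.
    static bool load(State& s, const char* path) {
        std::ifstream in(path, std::ios::binary);
        return static_cast<bool>(in.read(reinterpret_cast<char*>(&s), sizeof s));
    }

    static void save(const State& s, const char* path) {
        std::ofstream out(path, std::ios::binary | std::ios::trunc);
        out.write(reinterpret_cast<const char*>(&s), sizeof s);
    }

    int main() {
        const char* ckpt = "sim.ckpt";  // hypothetical checkpoint file
        State s;
        if (load(s, ckpt))
            std::printf("resuming at iteration %ld\n", s.iter);
        for (; s.iter < 1000000; ++s.iter) {
            s.value = 0.5 * (s.value + 2.0 / s.value);  // stand-in for real work
            if (s.iter % 100000 == 0)
                save(s, ckpt);  // periodic checkpoint: redo at most one interval
        }
        std::printf("done: value = %.17g\n", s.value);
    }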
Previous editions of this workshop have focused discussion around a number of interesting topics, including: bit-by-bit vs. scientific validation, reproducibility, unique characteristics of CoDeSE software that affect software development choices, major software quality goals for CoDeSE software, crossing the communication chasm between SE and CoDeSE, measuring the impact of SE on scientific productivity, SE tools and methods needed by the CoDeSE community, and how to effectively test CoDeSE software.
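The "bit-by-bit vs. scientific validation" topic is easy to see in code. The following sketch (ours, assuming only standard C++) reassociates a floating-point reduction, as a stand-in for a parallel reduction whose partial sums combine in a non-deterministic order, and shows why tests should compare results against a scientifically justified tolerance rather than demand bit-identical output; the tolerance value used here is illustrative.

    // Reordering a floating-point sum changes the rounding, so a correct run
    // on a different schedule need not be bit-identical to the reference.
    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    int main() {
        std::vector<double> x(1000000);
        for (std::size_t i = 0; i < x.size(); ++i)
            x[i] = 1.0 / (1.0 + static_cast<double>(i));

        // Left-to-right accumulation ("serial" order).
        double serial = std::accumulate(x.begin(), x.end(), 0.0);
        // Reversed order, a proxy for a non-deterministic parallel reduction.
        double reversed = std::accumulate(x.rbegin(), x.rend(), 0.0);

        std::printf("bit-identical?    %s\n", serial == reversed ? "yes" : "no");
        // Scientific validation: accept results within a problem-appropriate
        // tolerance instead of exact equality.
        double tol = 1e-12 * std::fabs(serial);
        std::printf("within tolerance? %s\n",
                    std::fabs(serial - reversed) <= tol ? "yes" : "no");
    }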
Motivated by discussion during the 2015 and 2016 workshops, this edition expands on its predecessors by continuing and extending two special focus areas and by emphasizing data-enabled science and engineering as a partner of computational science and engineering, turning CSE into CoDeSE. First, we place special emphasis on experience reports (positive, negative, and neutral) of applying software engineering practices to the development of HPC scientific software; documenting these successes and failures is important for the community. Second, because quality assurance remains a challenge in the scientific HPC domain, as discussed specifically in 2016, we also solicit papers describing quality assurance techniques for HPC science and their use in practice, focusing on the challenges of unit testing, system testing, and continuous integration for HPC codes, addressing both legacy code and testing at scale on different architectures and platforms.
For more information, contact Jeffrey Carver.
Last Updated on May 23, 2017 by Jeffrey Carver