University of California, Davis; Alamo, United States of America
The field of high-performance computing has long been plagued by reproducibility problems. In the early 1990s, lax standards for reporting performance led to considerable confusion and some loss of credibility for the field. Even today, HPC lags significantly behind other fields of scientific research in establishing standards for reproducible research, even for such basic practices as thoroughly documenting computer runs with algorithm statements, source code, system environment and other key details. Recently the issue of numerical reproducibility has risen to the fore, spurred both by the rapidly increasing scope and sophistication of large applications, which greatly magnify numerical sensitivities and precision requirements, and by growing interest in machine learning and artificial intelligence, which has driven the adoption of half-precision and other forms of reduced-precision computing. This talk will briefly summarize current work in the field and outline the challenges that lie ahead.
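As a small illustrative sketch (not drawn from the talk itself), the following Python snippet shows one root cause of numerical non-reproducibility: floating-point addition is not associative, so summing the same values in a different order, as can happen when a parallel reduction is scheduled differently from run to run, can yield different results.

```python
# Sketch: IEEE double-precision addition is not associative.
# Reordering a sum -- as a dynamically scheduled parallel reduction may do --
# changes the rounding and thus the final result.
vals = [1e16, 1.0, -1e16]

left_to_right = (vals[0] + vals[1]) + vals[2]  # 1.0 is absorbed into 1e16
reordered = (vals[0] + vals[2]) + vals[1]      # the large terms cancel first

print(left_to_right)  # 0.0
print(reordered)      # 1.0
```

In half precision the spacing between representable numbers is far larger, so such order-dependent rounding effects appear at much smaller magnitudes, which is one reason reduced-precision computing sharpens the reproducibility problem.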