V&V and NNSA Advanced Strategic Computing at LANL
[Updated September 26]
I think the program originally known as the Advanced Strategic Computing Initiative (ASCI) is now called Advanced Simulation & Computing (ASC).
[Updated August 26]
I have uploaded an excellent summary, developed by Los Alamos, of the ASC V&V Program at LANL. It is here.
A couple of quotes.
Confidence in simulation extrapolation comes via confidence in physics & numerics models, not calibration to experimental data.
Having “good agreement” between calculations and observations is not sufficient to establish scientifically credible predictive capability.
More and more it seems that the Climate Change Community remains the only holdout among compute-intensive enterprises when it comes to applying rigorous, independent V&V to computer software.
It’s been a while since I looked into the general activities in IV&V in scientific and engineering computing. Today I did a quick search over at http://www.osti.gov and got some good hits.
But first, relative to the two quotations in the original post, I failed to mention my usual mantra: Verification must always precede Validation. In the absence of Verification, Validation is not possible.
Reviewing today’s hits indicates to me that IV&V has been established as a priority of the highest order, as it necessarily should be, in the Advanced Simulation & Computing (ASC) Program work underway at LANL (Los Alamos), LLNL (Livermore), and SNL (Sandia).
This Sandia report, SAND 2008-5517 (Unlimited Release, printed January 2009), provides an extensive specification of the overall general SQA requirements. So far as I know, no document of a similar nature exists for any piece of software in the Climate Change Community.
This report, Enhanced Verification Test Suite for Physics Simulation Codes (LA-14379, issued September 2008; aka LLNL-TR-411291; aka SAND2008-7813, Unlimited Release, printed April 2009), is an excellent report on Verification.
The report deals strictly with Verification and reiterates what Verification is:
Verification deals with mathematical correctness of the numerical algorithms in a code, while Validation deals with physical correctness of a simulation in a regime of interest. This document is about Verification.
The report opens with this quotation:
In an age of spreading pseudoscience and anti-rationalism, it behooves those of us who believe in the good of science and engineering to be above reproach whenever possible. Public confidence is further eroded with every error we make… As Robert Laughlin noted in this magazine, ‘there is a serious danger of this power [of simulations] being misused, either by accident or through deliberate deception.’ Our intellectual and moral traditions will be served well by conscientious attention to verification of codes, verification of calculations, and validation, including the attention given to building new codes or modifying existing codes with specific features that enable these activities. Patrick Roache [Roa04]
where [Roa04] is this paper: Roache, P., “Building PDE Codes to be Verifiable and Validatable,” Comput. Sci. Engrng. 6, pp. 30–38 (2004).
The Introduction in this report provides excellent descriptions of the fundamental and critical necessity of Verification and the how-tos of implementing the Verification process. It should be required reading for everyone involved in producing software whose results have the potential to influence the establishment of policies that affect the health and safety of the public. Reading the Introduction is a good way to get a firm handle on the concepts of Verification.
I would like to include all of the Introduction here; there is much good info in it, but it’s kind of long. I especially like this:
What is Verification?
Verification is the process of demonstrating that numerical solutions of the discretized algorithms in simulation software are the correct solutions of the corresponding continuum equations. Consequently, verification represents an important aspect of the development, assessment, and application of simulation software for physics and engineering. An essential element of the verification process is the quantitative analysis of simulation code performance on well-defined problems. The outcome of such analyses provides hard evidence of mathematical consistency between the mathematical statements of the physics models and their discrete analogues as implemented with numerical algorithms in the simulation codes.
and this part:
Verification can be summarized as the analysis of whether the numerical solutions of the discrete algorithms provide accurate solutions of the corresponding continuum equations. Distinct numerical schemes based on the identical continuum equations can produce radically different quantitative (and qualitative) results; therefore, while one may obtain nominally correct solutions of the discretized schemes, those results might be inaccurate solutions of the underlying continuum equations. Consequently, verification analysis constitutes a critically important aspect of the development, assessment, and application of simulation software for physics and engineering. It is important to distinguish between verification for the purpose of proving code correctness, and for providing algorithmic assessment; both activities have high value, but differ in details and tenor. An essential element of any verification process is the quantitative analysis of the simulation code performance on well-defined problems. The outcome of such analyses provides defensible evidence of mathematical consistency between the mathematical statements of the physics models and their discrete analogues as implemented with numerical algorithms in the simulation codes.
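The "quantitative analysis of simulation code performance on well-defined problems" that the report describes is, in its simplest form, an order-of-accuracy (grid-convergence) test: solve a problem with a known exact solution at two step sizes and check that the error shrinks at the rate the numerical scheme formally promises. Here is a minimal sketch of that idea, my own illustration rather than anything from the report, using forward Euler on the decay equation u' = -u (all function names are mine):

```python
import math

def solve_decay(h, t_end=1.0, u0=1.0):
    """Integrate u' = -u with forward Euler at step size h; return u(t_end)."""
    u = u0
    n = round(t_end / h)
    for _ in range(n):
        u += h * (-u)
    return u

def observed_order(h):
    """Estimate the observed order of accuracy from errors at h and h/2.

    The exact solution at t_end = 1 is exp(-1); if the global error behaves
    like C*h^p, then log2(error(h)/error(h/2)) approximates p.
    """
    exact = math.exp(-1.0)
    e_coarse = abs(solve_decay(h) - exact)
    e_fine = abs(solve_decay(h / 2) - exact)
    return math.log2(e_coarse / e_fine)

# Forward Euler is formally first order, so the observed order should
# approach 1 as h is refined. If it does not, either the implementation
# or the formal analysis is wrong -- which is exactly the kind of "hard
# evidence of mathematical consistency" the report is talking about.
p = observed_order(1e-3)
```

The same idea scales up to PDE codes, where the comparison is against manufactured or analytic solutions on successively refined meshes; the enhanced test suite the report describes is essentially a curated collection of such well-defined problems.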