

Keeping the World Safe Through Extreme Scale Data Management

CADRE 2022


Session Information

Los Alamos has a long history in HPC, from the 1940s to the present. The 1992 nuclear test ban led to the need to conduct full-scale nuclear weapons testing through digital simulation. These virtual tests involve extreme-scale, multi-resolution, multi-physics, multi-length-scale simulations that run on a million cores with a petabyte of in-memory working data, and they routinely run for as much as a year to produce a useful test result. This type of computing can generate hundreds of petabytes of data to be managed over weeks to months. Managing the data from irreproducible nuclear tests, subcritical materials tests, and simulations comprising a billion files and exabytes of data over 70 years has kept Los Alamos near the cutting edge of at-scale data management methods. This presentation will cover the background of why and how we simulate, and will concentrate on the data management tools that have been developed, or are in development, to enable scientists to gain scientific insight for decision support in US nuclear stockpile stewardship.



Gary Grider







