July 15, 2011
The use of supercomputing to help maintain the US nuclear weapons arsenal is one of the more specialized applications of high performance computing. Simulating the behavior of these devices inside a computer has allowed the US to adhere to the Comprehensive Test Ban Treaty (CTBT), while maintaining some confidence that the country's nuclear deterrence capabilities remain intact. The responsibility for maintaining the arsenal virtually has fallen to the NNSA's Stockpile Stewardship Program, under the Department of Energy.
But the ability of these supercomputing models to replace actual nuclear testing is still somewhat controversial. A report by Chris Schneidmiller at Global Security Newswire weighs some of the pros and cons of physical versus simulated nuclear testing and the ramifications of our CTBT obligations. In particular, Schneidmiller begins by pointing out that skeptics believe that "computer modeling cannot effectively replace actual testing in terms of ensuring the upkeep of today's stockpile, nor for preparing new nuclear weapons that might one day be necessary to safeguard the United States from future threats."
In addition, new types of weapons might need to be developed to counter emerging threats. The Bush administration's proposal for the so-called "bunker busting" nuke is one such example. Having to develop an entirely new bomb without ever being able to detonate it is problematic at best.
The problem is that without some sort of physical testing, there is no assurance that the real-world behavior of the weapons is being reflected in the computer models. As former Defense Secretary Caspar Weinberger pointed out, the confidence that the weapons will work is the whole basis of our nuclear deterrence strategy. And the only way to demonstrate that is to test the devices.
Of course, the whole idea behind the Stockpile Stewardship Program is to demonstrate that confidence without the testing. According to Undersecretary of State for Arms Control and International Security Ellen Tauscher, the directors of national labs maintain that the program has "provided a deeper understanding of our arsenal than they ever had when testing was commonplace."
A 2002 study from the National Academy of Sciences concluded that the US nuclear stockpile could indeed be maintained, given enough computing power and other technical resources. Particularly in the 1990s, whether supercomputers were capable of accurately simulating these weapon systems was an open question. Today, with petascale machines available, there is less concern about capability.
In March at the Carnegie International Nuclear Policy Conference, CTBT opponent Senator Jon Kyl said that the Stockpile Stewardship Program offered "both good news and bad news" regarding our nuclear arsenal, but expressed reservations that the program was the ultimate answer to maintaining our nuclear deterrence.
Full story at Global Security Newswire