October 10, 2012
In Douglas Adams's popular BBC radio series "The Hitchhiker's Guide to the Galaxy," hyper-intelligent mice paid a great deal of money to have Earth built: a giant computer simulation disguised as a planet. The goal of their ten-million-year program was to determine the Ultimate Question of Life, the Universe, and Everything. (The answer, of course, was 42.)
Although Adams meant it as fiction, the universe itself might not be.
Everything can be modeled. That attitude is part of what drives advances in high performance computing. Physicists are currently working on modeling the strong nuclear force, described by quantum chromodynamics (QCD), which binds quarks and gluons into protons and neutrons.
These models are tiny in scale, operating at the femtometer range (femto = 10^-15). The hope is to extend that precision out to micrometers, allowing for the precise modeling of living cells. All of this will presumably be made possible by the relentless advance of Moore's Law.
So what if our universe is simply a gigantic, several-billion year old computer model?
The notion has existed as something of a philosophical curiosity since the advent of computers. The idea of scaling today's few-femtometer models up to the breadth of a universe shouldn't sound so ridiculous, especially to an engineer who would be older than the universe itself.
According to a team from the University of Bonn in Germany, headed by Silas Beane, cosmic ray detection may enable us to determine if we do, in fact, exist inside of a computer model. They presented their findings in a paper published earlier this month.
The hypothesis rests on one primary assumption: that any such model must use a three-dimensional grid, or lattice, along which the computation can be partitioned and run in parallel. The grid imposes limits on the model, since nothing can be resolved at scales smaller than the lattice spacing. If the thing being modeled were our universe, that spacing would imply a limit on the energy of particles.
Nature already exhibits such a limit: ultra-high-energy cosmic rays lose energy by interacting with photons of the Cosmic Microwave Background, producing a cutoff, at roughly 5×10^19 electron volts, known as the Greisen-Zatsepin-Kuzmin (GZK) limit.
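As a rough illustration of the scales involved (my own back-of-envelope sketch, not a calculation from the paper): if the GZK cutoff energy were supplied by a lattice with spacing b, with the maximum energy taken to be of order π·ħc/b, the implied spacing works out to far below even the femtometer scale of today's QCD models.

```python
# Back-of-envelope sketch: what lattice spacing b would put the
# energy cutoff pi * hbar*c / b at the GZK limit?
# The pi/b cutoff form and the constants below are illustrative assumptions.

import math

HBAR_C_MEV_FM = 197.327      # hbar*c in MeV*fm (CODATA value, rounded)
E_GZK_EV = 5e19              # GZK cutoff, roughly 5x10^19 eV
E_GZK_MEV = E_GZK_EV / 1e6   # convert eV -> MeV

b_fm = math.pi * HBAR_C_MEV_FM / E_GZK_MEV   # implied lattice spacing, fm
print(f"implied lattice spacing: {b_fm:.2e} fm")   # on the order of 1e-11 fm
```

The point of the arithmetic is only that such a lattice would be many orders of magnitude finer than the femtometer-scale grids physicists simulate today.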
“The most striking feature of the scenario,” the paper says, “in which the lattice provides the cut off to the cosmic ray spectrum, is that the angular distribution of the highest energy components would exhibit cubic symmetry in the rest frame of the lattice.”
In essence, if the GZK cutoff were supplied by a lattice, the highest-energy cosmic rays would prefer certain orientations, an angular distribution reflecting the "cubic symmetry in the rest frame of the lattice."
That statement offers something testable: the angular distribution of cosmic rays. If the highest-energy cosmic rays exhibit a preferred orientation, that orientation could reveal the axes of a simulation grid, a first step toward determining whether we are digital.
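A toy sketch of what such a test might look like (my own illustration, not the paper's analysis): take the arrival directions of the highest-energy events as unit vectors and ask whether they cluster around the coordinate axes, as cubic symmetry would suggest. Here the "data" are simulated isotropic directions standing in for real observations.

```python
# Toy cubic-symmetry test: does a set of arrival directions prefer
# the coordinate axes? All names and the statistic are illustrative.

import random

def alignment(v):
    """Max absolute component of a unit vector: 1.0 means exactly on an axis,
    1/sqrt(3) (~0.577) means pointing along a cube diagonal."""
    return max(abs(c) for c in v)

def random_direction(rng):
    """Uniform random unit vector (an isotropic sky), via rejection sampling."""
    while True:
        x, y, z = (rng.uniform(-1, 1) for _ in range(3))
        r2 = x*x + y*y + z*z
        if 0 < r2 <= 1:
            r = r2 ** 0.5
            return (x / r, y / r, z / r)

rng = random.Random(42)
events = [random_direction(rng) for _ in range(1000)]  # stand-in for real data
mean_alignment = sum(alignment(v) for v in events) / len(events)
print(f"mean axis alignment: {mean_alignment:.3f}")
# An isotropic sky yields a fixed expected value for this statistic;
# a significant excess in real data would hint at a preferred lattice axis.
```

A real analysis would be far more involved, but the shape of the test is the same: compare an observed angular statistic against its isotropic expectation.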
According to the paper, such a confirmation would be only the first item on a long checklist. "Of course, improvement in this context masks much of our ability to probe the possibility that our universe is a simulation."
The researchers note, however, that the universe is finite (it may expand faster than the speed of light, but it is still finite), which for them means that the model's volume is finite and the spacing between potential grid lines is non-zero.
Whatever answer such a model yields, it is likely to be more complex than 42.