October 18, 2011
Electromagnetic radiation is pervasive. We are surrounded by appliances, mobile devices and their signals, and a host of other technologies that emit microwaves that pass through our bodies.
Major questions have emerged in recent years about the effect of this radiation on the human body—not least the question of whether brain and other cancers might be linked to the use of cell phones and other microwave-radiating devices. In an attempt to discover just how these microwaves interact with our bodies as they pass through us, the University of Texas at Austin has been tasked with a five-year interdisciplinary study that uses one of the highest-resolution electromagnetic human models to date.
This human model, called AustinMan, is helping scientists understand in great detail what happens to body tissues when they encounter microwaves, particularly those from mobile devices. The scientists behind the AustinMan modeling project claim that their method is superior to the “traditional” method of gauging the impact of wireless devices on our bodies—one that relied almost exclusively on survey and statistical data to make broad generalizations about relationships between wireless devices and human health problems.
AustinMan is, as the University of Texas describes, “a publicly available model that represents the human body with one-millimeter-cubed resolution (something akin to a virtual Lego body composed of extremely small parts).” To create the AustinMan model, the group worked with anatomists to transform the image slices into computational maps of the body’s tissues. “Whereas previous models had included only a handful of tissue types, the current model contains 30 types of tissues, each with unique electromagnetic properties. Overall, the model contains more than 100 million voxels (3-D versions of pixels) that interact with one another during the virtual cellphone calls.”
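The idea of a voxelized body model—a 3-D grid in which each one-millimeter cube is labeled with a tissue type carrying its own electromagnetic properties—can be illustrated with a minimal sketch. The tissue names, permittivity, and conductivity values below are hypothetical placeholders for illustration, not the AustinMan dataset:

```python
import numpy as np

# Hypothetical tissue table: id -> (name, relative permittivity, conductivity S/m).
# Values are illustrative only, not taken from the AustinMan model.
TISSUES = {
    0: ("air",    1.0, 0.0),
    1: ("skin",  41.0, 0.9),
    2: ("fat",    5.3, 0.05),
    3: ("brain", 45.0, 0.77),
}

# A tiny 4x4x4 voxel grid (think 1 mm cubes); each voxel stores a tissue id.
# AustinMan works the same way, but with over 100 million voxels.
grid = np.zeros((4, 4, 4), dtype=np.uint8)
grid[1:3, 1:3, 1:3] = 3   # a block of "brain" voxels
grid[0, :, :] = 1         # a layer of "skin" voxels

# Look up the electromagnetic properties for every voxel in the grid.
eps_r = np.vectorize(lambda t: TISSUES[t][1])(grid)
sigma = np.vectorize(lambda t: TISSUES[t][2])(grid)

print(grid.size)         # 64 voxels in this toy grid
print(eps_r[2, 2, 2])    # 45.0 -- the "brain" permittivity placeholder
```

An electromagnetic solver then updates the fields in each voxel using that voxel's material properties, which is what makes per-tissue resolution matter.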
According to Aaron Dubrow from the Texas Advanced Computing Center (TACC), “Such extreme simulations are impossible using traditional computing methods and software. Even with the efficient algorithms that the researchers are developing, each simulation would take about five years of continuous execution on an ordinary desktop computer. Crunching the numbers on the Ranger supercomputer at TACC on the Pickle Research Campus in North Austin, however, the lead researcher and his team can perform these simulations in less than six hours.”
Dubrow writes, “During the past two years, the project has used more than 3 million computing hours on TACC’s supercomputers, the equivalent of 342 years on a single processor.”
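The quoted figures are easy to sanity-check with back-of-the-envelope arithmetic (assuming 24-hour days and 365-day years):

```python
# One simulation: ~5 years of continuous execution on a desktop,
# versus ~6 hours on the Ranger supercomputer.
desktop_hours = 5 * 365 * 24              # 43,800 hours
ranger_hours = 6
speedup = desktop_hours / ranger_hours
print(round(speedup))                     # roughly a 7,300x speedup

# 3 million computing hours expressed as single-processor years.
total_hours = 3_000_000
processor_years = total_hours / (365 * 24)
print(round(processor_years))             # ~342 years, matching the article
```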
While the team’s goal is not to explore direct medical connections in depth, the model will enable more detailed studies that can shed light on the effects of cellphones and other devices that rely on wireless signals.
Full story at University of Texas