September 12, 2013
In this age of big data, would it surprise you to learn that supercomputers are on track to predicting wars, revolutions and other societal disruptions? Data scientist Kalev Leetaru is one of the foremost proponents in the emerging field of predictive supercomputing. His research helped usher in the era of "petascale humanities," where computers can identify useful or interesting patterns if provided with sufficiently large data repositories.
Yahoo Fellow in Residence at Georgetown University in Washington, DC, and formerly affiliated with the Institute for Computing in the Humanities, Arts and Social Science at the University of Illinois, Leetaru amassed a collection of over one hundred million articles from media outlets around the world, spanning 30 years, with each item translated and tagged for geography and tone. Leetaru analyzed the data with a shared memory supercomputer called Nautilus, creating a network with 10 billion items connected by one hundred trillion semantic relationships.
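The article does not show how such a semantic network is built, but the idea is straightforward at small scale: entities mentioned in the same article become linked, with edge weights counting how many articles they share. The sketch below is purely illustrative (the function name, data shape, and sample entities are invented for this example, not drawn from Leetaru's actual pipeline):

```python
# Illustrative miniature of a co-occurrence network like the one built on
# Nautilus: each article contributes links between every pair of entities
# it mentions. Names and data here are hypothetical.
from collections import defaultdict
from itertools import combinations

def build_network(articles):
    """articles: list of sets of entity names mentioned in each article.
    Returns a dict mapping frozenset({a, b}) -> number of shared articles."""
    edges = defaultdict(int)
    for entities in articles:
        for a, b in combinations(sorted(entities), 2):
            edges[frozenset((a, b))] += 1
    return dict(edges)

# Tiny sample corpus: entities tagged in three hypothetical articles.
sample = [
    {"Cairo", "Mubarak", "protest"},
    {"Cairo", "Mubarak"},
    {"Tunis", "protest"},
]
net = build_network(sample)
print(net[frozenset({"Cairo", "Mubarak"})])  # prints 2
```

At Nautilus scale the same bookkeeping spans billions of items, which is why shared-memory hardware matters: the edge table is far too large and too randomly accessed for comfortable partitioning.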
The 30-year worldwide news archive was part of a 2011 study called Culturomics 2.0: Forecasting large-scale human behavior using global news media tone in time and space. The findings were impressive, pointing to a degree of predictive ability greater than chance would account for. Events that could be predicted included the revolutions in Tunisia, Egypt, and Libya, including the removal of Egyptian President Mubarak. The corpus also correctly anticipated a period of stability in Saudi Arabia.
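A central finding of the study was that the average tone of news coverage about a country dropped sharply in the months before major unrest. A minimal sketch of that signal, assuming monthly tone scores and a simple trailing-window z-score (the function and thresholds are invented for illustration, not Leetaru's actual method):

```python
# Illustrative only: flag months whose news tone falls far below the
# trailing average, the kind of tone collapse Culturomics 2.0 observed
# ahead of the Tunisian, Egyptian, and Libyan revolutions.

def flag_instability(tone_series, window=12, threshold=2.0):
    """Return indices of months whose tone is more than `threshold`
    standard deviations below the trailing `window`-month mean."""
    flags = []
    for i in range(window, len(tone_series)):
        history = tone_series[i - window:i]
        mean = sum(history) / window
        var = sum((x - mean) ** 2 for x in history) / window
        std = var ** 0.5 or 1e-9  # guard against a flat history
        if (tone_series[i] - mean) / std < -threshold:
            flags.append(i)
    return flags

# Hypothetical data: a year of stable tone, then a sharp collapse.
tones = [0.1, 0.2, 0.15, 0.1, 0.2, 0.1, 0.15,
         0.2, 0.1, 0.15, 0.2, 0.1, -1.5]
print(flag_instability(tones))  # prints [12] -- the collapse month
```

The real study worked with hundreds of millions of tone-scored articles rather than a toy series, but the shape of the signal is the same: it is the deviation from a country's own baseline, not the absolute tone, that carries predictive weight.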
Leetaru takes this to mean that it's possible to predict major upheavals, like the Arab Spring, with some degree of confidence.
"It's like a weather forecast," he says in a recent Kernel article. "A 70 per cent chance of rain tomorrow means that it might not rain, but it's probably worth bringing an umbrella, because the conditions for rain are there."
So far, all of these "predictions" have been made after the events had already occurred, but that is exactly how other forecasting models are vetted for accuracy. The real test will be anticipating events that have not yet happened.
From the Kernel article: "Leetaru believes we should give supercomputers a chance at predicting global conflicts. Why? Because humans are even worse at predicting major world events than computers. The first people defeated in the uprisings of the Arab world weren't dictators, but political scientists all over the world. None of them had seen it coming."
But even if they had seen it coming, what then? The implications are mind-boggling, not to mention sci-fi-movie scary, harkening back to the 2002 Tom Cruise film Minority Report. Two questions come to mind: if we knew someone was going to commit a crime, what would we do with that knowledge? And what degree of certainty would a prediction require before it became actionable?