March 20, 2009
A couple of random items this week connected only by the inscrutable nature of research funding.
Europe Joins the Petaflop Club
It looks like Europe -- Germany in particular -- will get its first petaflop supercomputer this year with the installation of a new Blue Gene/P. The Juelich Research Center announced that it is partnering with the German Gauss Centre for Supercomputing to procure the system, which is set to boot up around the middle of the year. The 2.2 megawatt machine will be the first Blue Gene to employ water cooling technology, enabling a "91 percent reduction in air conditioning."
If the Germans had installed this last year, say before Los Alamos' Roadrunner came online, it would have been huge news. In fact, it would also have precipitated an Earth Simulator-type panic in the US HPC community. But the reality is that Europe always seems to run 12 to 18 months behind the U.S. in the deployment of top systems.
The way I see it, there's no particular reason Europe has to play catch-up to America in regard to supercomputing prowess. With a 2008 GDP of around $18.9 trillion, the 27-member European Union (EU) actually has a larger capital base from which to draw than the US, which has a 2008 GDP of $14.3 trillion. And it's not like tax rates are particularly low in Europe or the citizenry doesn't support science and technology. If the EU funded PRACE -- the Partnership for Advanced Computing in Europe -- to the extent the US funds HPC in the Department of Energy, we'd see a lot more parity in supercomputing, not to mention a lot more supercomputing in general.
I'm certainly no expert on EU government policy, but the main problem appears to be that the governing bodies are only weakly centralized, so each nation tends to act more in its own self-interest than for the greater good of the EU. Of course, the EU, which was formed in 1993, is a lot younger than the US. And cultural differences in Europe are more starkly defined than in the US. Even so, if taxpayers from Mississippi and Massachusetts can build supercomputers for Tennessee and New Mexico, surely the Europeans can do the equivalent.
Hug a Scientist
While European scientists may envy their HPC-laden US counterparts, all is not joy on this side of the pond. In the New York Times this week, Stanford University professor Stephen Quake writes about what life is like for scientists working in the trenches of the modern research university. Quake is a biophysicist and part-time entrepreneur whose interests "lie at the nexus of physics, biology and biotechnology."
Quake describes how the "business" of science has become a central facet of research these days. Instead of devoting their lives to teaching and research, professors must now dedicate a good chunk of their time to gathering funding for their work:
When a university hires a professor, they typically agree to provide a start-up package to support that professor's research over the first few years, after which the professor must seek external funding. This funding is needed to buy research supplies, pay stipends and tuition for graduate students, and even to support the salary of the faculty member. In fact, the university rarely pays the full salary of the professor — depending on the department, the professor must find between 25 percent and 75 percent of his or her salary from outside grants.
Quake notes that at Stanford, despite its huge endowment income and high tuition rates, the money derived from outside grants is the single largest source of funds for the university. Since grant writing is performed by professors, the faculty ends up as the de facto marketing department for the university. What does grant writing have to do with science? Not much, says Quake:
Science at its most interesting is provocative, surprising, counter-intuitive and difficult to plan — and those are very difficult values to institutionalize in an organization or bureaucracy of any size. I have seen my own grant proposals get chewed up and rejected with comments like "typically bold, but wildly ambitious," and wondered why it is wrong to be ambitious in one's research — but perhaps that is a conclusion fully consistent with science by committee.
And you thought those profs had such cushy jobs.
Posted by Michael Feldman - March 20, 2009 @ 11:27 AM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.