June 26, 2013
The hunt for new and useful materials got a big boost this week when Intermolecular agreed to lend its advanced combinatorial processing technology to the Materials Project, a materials-discovery computing project launched by Lawrence Berkeley National Lab and the Massachusetts Institute of Technology (MIT). The blend of data and techniques could speed the discovery of new materials by a factor of 10, researchers say.
The Materials Project was founded in late 2011 with the goal of accelerating the discovery of novel compounds by giving materials scientists and engineers open access to supercomputer resources. It currently takes an average of 18 years to bring a new material, such as a battery compound, a fuel, or a crystalline structure, from the lab into commercial production using traditional techniques, the group says. Using the power of supercomputers, however, researchers can now predict the properties of materials before they're even synthesized in the lab.
The pace of materials-related innovation should improve now that the privately held, San Jose, California-based company Intermolecular has agreed to lend its proprietary High Productivity Combinatorial (HPC) tools and research data to the Materials Project. Intermolecular's HPC approach uses advanced combinatorial processing systems that allow dozens or hundreds of experiments to be conducted in parallel, as opposed to traditional sequential tests. The results are then analyzed, and work continues on the most promising candidates.
The use of Intermolecular's trademarked HPC approach and data will be a boon to the Materials Project's HPC (as in high performance computing) resources at the National Energy Research Scientific Computing Center (NERSC), according to Berkeley Lab scientist Kristin Persson, who is also the co-founder of the Materials Project.
"Access to high-quality experimental data is absolutely essential to benchmark high-throughput computational predictions for any application," Persson says in a story on the Berkeley Lab website. "We begin every materials discovery project with a comparison to existing data before we venture into the space of undiscovered compounds. This is the first effort to integrate private sector experimental data into the Materials Project, and could form the basis of a general methodology for integrating experimental data inputs from a wide-range of scientific and industrial sources."
Persson sees Intermolecular's data helping in two ways. First, if the values generated by the Materials Project are significantly off from the Intermolecular data for a given problem, it will tell researchers they may need to refine their methodologies and models. If the values are close, it will give researchers the confidence that they're on the right path, she says.
The Materials Project is one of several interrelated projects that fall under the umbrella of the Materials Genome Initiative. The Obama Administration's Office of Science and Technology Policy launched the Materials Genome Initiative in 2011 to foster cooperation between industry and academic researchers, with the goal of doubling the pace of development of advanced materials, such as materials for safer and more fuel-efficient vehicles, packaging that keeps food fresher and more nutritious, and vests that better protect soldiers.