February 09, 2012
February 8 -- Dave Hart and Irfan Elahi from the National Center for Atmospheric Research (NCAR) will host a workshop on campus for University of Wyoming faculty interested in using the NCAR-Wyoming Supercomputing Center (NWSC) for research.
The workshop is from 1-3 p.m. Thursday, Feb. 23, in Room 222 of the Classroom Building. Hart is the user services manager at NCAR, and Elahi is head of the organization's supercomputer services group.
Approximately 75 million core hours per year will be reserved for UW researchers and their collaborators at the facility. Research time and use -- focused on computationally or data-intensive Earth system science -- will be allocated by the Wyoming-NCAR Resource Advisory Panel (WRAP).
"With these requests, UW faculty will be asking for computer time, short-term storage, data analysis and visualization, and long-term storage," says Bryan Shader, a UW professor of mathematics and special assistant to the vice president of research and economic development. "The currency of the realm is a core hour; one core hour represents the usage of one processor for one hour. For each of the other resources, there is a core-hour equivalent conversion factor."
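The core-hour arithmetic Shader describes can be sketched in a few lines. This is a minimal illustration only: the function name and the per-resource conversion factors below are hypothetical placeholders, not the actual WRAP conversion rates, which are not stated in this article.

```python
# Hypothetical conversion factors (the real WRAP factors are not given here).
CORE_HOURS_PER_TB_STORAGE = 1000.0  # core-hour equivalent per TB of storage
CORE_HOURS_PER_VIZ_HOUR = 50.0      # core-hour equivalent per visualization hour

def total_core_hours(compute_hours, cores, storage_tb=0.0, viz_hours=0.0):
    """Convert a mixed resource request into core-hour equivalents.

    One core hour = one processor used for one hour; other resources
    are converted with (hypothetical) per-resource factors.
    """
    compute = compute_hours * cores          # processors x hours
    storage = storage_tb * CORE_HOURS_PER_TB_STORAGE
    viz = viz_hours * CORE_HOURS_PER_VIZ_HOUR
    return compute + storage + viz

# Example: 500 hours on 1,024 cores, plus 20 TB storage and 100 viz hours
print(total_core_hours(500, 1024, storage_tb=20, viz_hours=100))  # 537000.0
```

The point of the single currency is that reviewers can weigh heterogeneous requests (compute, storage, analysis) against one finite pool.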
UW faculty will have an opportunity to submit their allocation requests for a March 6 pre-review. Faculty will then have until March 26 to incorporate suggestions from the pre-review and strengthen their final allocation requests, he says.
"For many faculty, this will be the first time for an allocation request of this size," Shader says. "It's in their best interest to put together the best proposal for a pre-review. We want UW allocation requests to be top quality."
Because the overall time allocation of the supercomputing center for UW is finite, Shader says the WRAP evaluates allocation requests on computational experimental design, computational effectiveness, efficiency of resource use, progress from prior allocations, and broader impacts.
"This is a $30 million machine with a usable lifespan of three to four years. You want to run it at the highest capacity you can," he says. "Every time you run it without much thought, you're wasting money. Researchers must carefully plan how to use the resource most wisely and effectively."
Additional information about the allocation process can be found at http://www.uwyo.edu/nwsc .
The NWSC is being developed in partnership with UW; the state of Wyoming; Cheyenne LEADS; the Wyoming Business Council; Cheyenne Light, Fuel and Power; and the University Corporation for Atmospheric Research. NCAR is sponsored by the National Science Foundation.
The NWSC will contain some of the world's most powerful supercomputers dedicated to improving scientific understanding of climate change, severe weather, air quality and other vital atmospheric science and geoscience topics. The center also will house a premier data storage and archival facility that holds irreplaceable historical climate records and other information.
UW allocations are expected to be awarded in May and to be available for use starting in fall 2012.
To help plan for the workshop, interested faculty are requested to send an RSVP to email@example.com by Feb. 16.
Source: University of Wyoming