August 23, 2013
Cancer statistics. Medicare fraud. Peanut recalls. Federal government agencies collect data on all types of phenomena, and just about every piece of data is useful to somebody, somewhere, at some time. More of that data will find its way into the hands of the public and entrepreneurs thanks to the government's new data-sharing mandate, and this month federal agencies received new guidelines telling them how to comply with it.
In May, the White House issued an executive order requiring government agencies to share data, on the premise that data is a national asset that will help fuel innovation, economic growth, and government efficiency. At the same time, the government adopted a new Open Data Policy that requires all newly generated government data to be shared with the public in machine-readable formats, such as CSV or TEXT files.
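"Machine-readable" simply means a format a program can parse directly, rather than a PDF or printed table. As a minimal sketch of what that looks like in practice (the dataset names and values below are invented for illustration), Python's standard `csv` module can emit such a file:

```python
import csv
import io

# Hypothetical records standing in for a released government dataset;
# the field names and values here are invented for this example.
records = [
    {"dataset": "peanut_recalls", "year": "2013", "count": "12"},
    {"dataset": "medicare_claims_reviews", "year": "2013", "count": "48"},
]

# Write the records as CSV text; in a real release this would go to a file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["dataset", "year", "count"])
writer.writeheader()
writer.writerows(records)

csv_text = buf.getvalue()
print(csv_text)
```

Because the output is plain structured text, any downstream tool, from a spreadsheet to an analytics pipeline, can consume it without manual transcription, which is the point of the mandate.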
This month, the White House’s Office of Science and Technology Policy provided guidance to help the agencies do exactly that. The assistance from the OSTP takes several forms, including: directions on how agencies can inventory and publish data; a set of FAQs on how the new policy affects the federal acquisition and grant-making process; a framework for creating measurable goals that agencies can use to track their progress; and a set of free tools, case studies, and other resources that agencies can download from the Project Open Data website at http://project-open-data.github.io.
According to the OSTP, federal agencies must, by November 13, have taken the following actions: create and maintain an "enterprise data inventory"; create and maintain a "public data listing"; create a process to engage with customers to help facilitate and prioritize data release; document when data cannot be released; and clarify roles and responsibilities for promoting efficient and effective data release.
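Project Open Data specifies a common JSON metadata schema for these listings, with fields such as "title" and "accessLevel". The sketch below, which assumes that schema's field names but uses an entirely invented sample catalog, shows how a public data listing is conceptually just the releasable subset of the enterprise data inventory:

```python
import json

# Invented sample of an enterprise data inventory in the Project Open Data
# metadata style; the field names follow the published schema, but every
# entry here is hypothetical.
inventory_json = """
[
  {"title": "Peanut Recall Events",       "accessLevel": "public"},
  {"title": "Claims Review Worksheets",   "accessLevel": "non-public"},
  {"title": "Provider Directory",         "accessLevel": "public"}
]
"""

inventory = json.loads(inventory_json)

# The public data listing is the subset of the inventory an agency can
# release; non-public entries stay documented internally, satisfying the
# "document if data cannot be released" requirement.
public_listing = [d for d in inventory if d["accessLevel"] == "public"]

for dataset in public_listing:
    print(dataset["title"])
```

Keeping both views in one inventory means an agency maintains a single source of truth while still controlling what appears in the public-facing listing.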
The new Open Data Policy will benefit Americans and the economy, explained two government officials--Nick Sinai, U.S. Deputy CTO of the OSTP, and Dominic Sale, Supervisory Policy Analyst of the Office of Management and Budget--in a letter posted this month to the OSTP website.
“Opening up a wide range of government data means more entrepreneurs and companies using those data to create tools that help Americans find the right health care provider, identify a college that provides good value, find a safe place to live, and much more,” they wrote. “It also empowers decision makers within government, giving them access to more information to enable smarter, data-driven decisions.”
The government has shared more than 63,000 datasets in the last 12 months, including more than 30,000 from the Department of Commerce and more than 20,000 from the Department of the Interior alone, according to data.gov, a clearinghouse of government data. The government is also building a new, cleaner-looking, and easier-to-use data-sharing website, which you can preview at next.data.gov.