January 24, 2013
KAISERSLAUTERN, Germany, Jan. 24 – The Competence Center for HPC at the Fraunhofer ITWM in Kaiserslautern announces the new major release of FhGFS.
With the first release of FhGFS in 2007, Dr. Franz-Josef Pfreundt, director of the Competence Center, promised "a flexible, robust and easy-to-use parallel file system for everybody". While these goals still guide FhGFS development, the new release introduces major improvements and new features requested by users.
Many of the improvements result in a significant increase in performance and scalability, especially for metadata operations. In benchmarks performed with the beta release, a single metadata server created about 35,000 files per second; by distributing the load across 20 metadata servers, file creation rates of over 500,000 operations per second were achieved. These and other benchmark results were presented at SC12 in Salt Lake City.
With this major release, FhGFS adds an important capability: on-the-fly replication of file contents and metadata. This big step toward built-in high availability moves FhGFS closer to an enterprise-grade parallel file system.
After several publicly available beta releases, the official stable version, named 2012.10-r1, can be downloaded from http://fhgfs.com. As always, the download is free of charge, and Fraunhofer offers optional commercial support.
About Fraunhofer and the Competence Center HPC at Fraunhofer ITWM
With more than 80 research units at different locations in Germany, the Fraunhofer-Gesellschaft is the largest organization for applied research in Europe. About 30% of Fraunhofer's budget is provided by the German government; the other 70% is generated through contract research. The majority of its more than 20,000 staff members are qualified scientists and engineers. Fraunhofer hosts research centers and representative offices in Europe, the USA, Asia, and the Middle East. The Fraunhofer Institute for Industrial Mathematics (ITWM) is located in Kaiserslautern. Its main business fields are industrial mathematics and information technology, with a dedicated Competence Center specializing in High Performance Computing.
The Fraunhofer Competence Center for HPC (CC-HPC) offers innovative HPC solutions for industry and the HPC market. CC-HPC's R&D is focused on HPC tools, applications for seismic imaging, complete development and execution frameworks, and server-based visualization. The CC-HPC works together with industry partners to develop HPC applications and to adapt software to HPC needs. Its main HPC tools are the Fraunhofer Parallel Filesystem (FhGFS), the Global address space Programming Interface (GPI), and the GPI-Space parallel programming and execution framework.
FhGFS is the high-performance parallel file system from the Fraunhofer Competence Center for High Performance Computing. The distributed metadata architecture of FhGFS has been designed to provide the scalability and flexibility that is required to run today's most demanding HPC applications. FhGFS is provided free of charge. Commercial support is available directly from Fraunhofer or from international partners.