June 28, 2013
The United States is implementing a new “Cloud-First” computing strategy, under which federal agencies will begin transferring applications from private datacenters to hybrid and public cloud infrastructures. As part of a broader effort to reduce spending, the government aims to cut computing costs by moving as many workloads and applications to the cloud as possible.
Indeed, as the report noted, “the goal is to accelerate the Federal government’s adoption of secure and effective cloud computing solutions to reduce costs and improve services.”
Of course, for that to be possible, the United States government will need to ensure maximum security for those cloud applications. The National Institute of Standards and Technology released its security reference architecture for cloud computing, setting an important baseline for cloud-based systems in the United States.
“The approach to securing a cloud Ecosystem is intrinsically related to the cloud computing service model (SaaS, PaaS, or IaaS) and to the deployment model (Public, Private, Hybrid, or Community) that best fits the Consumer’s business missions and security requirements.” In the figure below, NIST identified the responsibilities of the users relative to deploying a cloud system, focusing on specific standards for each service level.
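The idea that the Consumer's responsibilities shift with the service model can be sketched in a few lines. This is an illustrative example only, not taken from the NIST figure: the layer names are a common textbook decomposition of the cloud stack, and the split points between Consumer and Provider are assumptions.

```python
# Illustrative sketch (assumed, not from the NIST document): which stack
# layers a Consumer manages under each service model, with the Provider
# responsible for the remaining lower layers.
STACK = ["data", "application", "runtime", "os", "virtualization", "hardware"]

# Assumed index into STACK at which Provider responsibility begins.
provider_from = {"IaaS": 4, "PaaS": 2, "SaaS": 1}

def consumer_layers(model):
    """Return the layers the Consumer manages under a given service model."""
    return STACK[:provider_from[model]]

print(consumer_layers("IaaS"))  # Consumer manages everything above virtualization
print(consumer_layers("SaaS"))  # Consumer mainly manages its own data
```

The point the sketch makes is the one in the quote: the security responsibilities a Consumer must plan for depend directly on which service model is chosen.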
Indeed, NIST presented its architecture largely in the context of the different service models: IaaS, PaaS, and SaaS. “Understanding the relationships and interdependencies between the different cloud computing deployment models and service models is critical to understanding the security risks involved in cloud computing.” The figure below diagrams in detail how these models relate to each other.
As can be seen in the image above, the SaaS, PaaS, and IaaS service models sit within a service layer, which itself resides in the orchestration stack that coordinates the software with the underlying infrastructure.
In general, the government, through NIST, is looking to establish a risk-based approach to securing applications in the cloud. Here, risk-based means analyzing the security components of each cloud deployment to determine which responsibilities fall to each of the parties involved, as explained below.
“The NCC-SRA introduces a risk-based approach to determine each cloud Actor’s responsibility for implementing specific controls throughout the life cycle of the cloud Ecosystem. Specifically, for each instance of the cloud Ecosystem, the security components are analyzed to identify the level of involvement of each cloud Actor in implementing those components.” That risk determination would happen through what the document referred to as a cloud broker, the full impact of which can be seen in the figure below.
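The per-control analysis the quote describes can be pictured as a simple tally: for each security component, estimate each Actor's share of the implementation work, then aggregate. This is a hedged sketch of the idea only; the Actor names follow NIST terminology, but the controls and the responsibility shares below are invented for illustration.

```python
# Illustrative sketch (assumed values): modeling the NCC-SRA idea that each
# security control is analyzed to determine how implementation responsibility
# is split among the cloud Actors for a given cloud Ecosystem instance.
ACTORS = ("Consumer", "Provider", "Broker")

# Assumed share of implementation responsibility per Actor, per control.
controls = {
    "access-control":    {"Consumer": 0.5, "Provider": 0.5, "Broker": 0.0},
    "incident-response": {"Consumer": 0.2, "Provider": 0.6, "Broker": 0.2},
    "audit-logging":     {"Consumer": 0.1, "Provider": 0.8, "Broker": 0.1},
}

def involvement(actor):
    """Aggregate an Actor's level of involvement across all controls."""
    return sum(shares[actor] for shares in controls.values())

for actor in ACTORS:
    print(f"{actor}: {involvement(actor):.1f}")
```

However the real analysis is performed, the output is the same in spirit: a picture of who is on the hook for which controls over the life cycle of the cloud Ecosystem.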
In the document, NIST recognized that ensuring security across cloud platforms would be a different animal than doing so for traditional IT structures. Cloud computing will nearly always involve at least one more party: the organization that runs and operates the datacenters. That is on top of the various vendors providing the IaaS, PaaS, and SaaS systems.
“A Consumer’s ability to comply with any business, regulatory, operational, or security requirements in a cloud computing environment is a direct result of the service and deployment model adopted by the agency, the cloud architecture, and the deployment and management of the resources in the cloud environment.”
According to the document, the NIST strategy is two-fold. The first part places a high priority on establishing security, interoperability, and portability requirements, while the second is to establish relationships with standards bodies and leading cloud technology vendors in the private sector. NIST hopes that drawing on the experience of those already running applications in the cloud while keeping their data secure, such as public and private research organizations, will prove advantageous in developing cloud technology standards.
Once that happens, it will be up to users to adapt to the standard set by the initiative. “For each use case of data migrated to the cloud, it is necessary for the Consumer to evaluate the particular security requirements in the specific cloud architectural context, and to map them to proper security controls and practices in technical, operational, and management classes.”
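The mapping exercise the quote asks of Consumers can be sketched as a simple grouping of requirements into the three control classes the document names. The class names come from the quote; the requirement names and their class assignments below are invented examples.

```python
# Hedged sketch: grouping a Consumer's security requirements by the
# technical, operational, and management control classes named in the
# NIST document. The requirements themselves are illustrative examples.
CLASSES = ("technical", "operational", "management")

requirements = [
    ("encryption at rest",         "technical"),
    ("incident response plan",     "operational"),
    ("risk assessment policy",     "management"),
    ("multi-factor authentication", "technical"),
]

def by_class(reqs):
    """Group (requirement, class) pairs into the three control classes."""
    grouped = {c: [] for c in CLASSES}
    for name, cls in reqs:
        grouped[cls].append(name)
    return grouped

for cls, names in by_class(requirements).items():
    print(f"{cls}: {names}")
```

The real evaluation would of course start from the specific cloud architectural context of each migration, as the quote stresses, rather than from a fixed list.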
This cloud-first initiative, and a potentially unified security reference architecture, could have significant implications for HPC applications running in the cloud. The US government’s main purpose here is to cut costs. The steps it takes to accomplish that while still running numerous applications, many of a high-performance nature, will further the field’s visibility and sustainability.