May 10, 2013
The private industry least likely to adopt public cloud services for data storage is finance. Because they hold one of the most sensitive and heavily regulated data types, personal financial information, banks and similar institutions have mostly moved toward private cloud services, and at great cost.
A change is underway, though, as services in the public cloud space are beginning to offer high-end security options that align with financial market needs. Investment banking has been the most open to adopting public clouds, as that industry's ever-growing thirst for data (and the need to store it) demands lower-cost solutions. But more and more, other banking services are warming to the idea.
The old barriers to cloud adoption are evaporating. The three issues that kept banking out of the cloud were:

- Data security
- Secure access and use by local applications
- Geographic location of the data
In financial markets, data is becoming more fluid and often more sharable as technologies change and competitive forces push institutions away from close-to-the-vest dealings and toward faster, more responsive, near-real-time decisions. This requires faster data access, deeper analysis, and better sharing methods. The cloud is perfect for this, but the security concerns remain.
Here are ways that financial institutions are overcoming the three barriers above to facilitate the advantages of cloud storage.
Overcoming Security Barriers
Cloud providers now often offer certain customers proof of their security measures. The best proof is third-party testing and validation, followed by your own in-house experts testing the service provider's systems. While first-hand checking may be best for your own peace of mind, third-party verification and proof of compliance are useful for legal purposes and can shift liability from you to the service provider should security be breached or found inadequate by a compliance audit.
Data security standards such as NIST's AES and FFX offer a benchmark for security requirements, depending on your institution's needs. Many certified third-party testing companies can verify that your provider meets these standards at a minimum, and occasional audits to re-check compliance can keep you in good standing with regulators.
By taking a data-centric approach to the enterprise cloud stack, data-risk and compliance requirements can be addressed. With proper protection and an IAA (ID, Authentication, Authorization) service layer, data can be shared with various applications without risky exposure.
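As a rough illustration of what an IAA layer does, the sketch below separates the three steps: identification (is this a known user?), authentication (can they prove who they are?), and authorization (may they touch this dataset?). All class and dataset names here are hypothetical, not any provider's real API.

```python
# Minimal sketch of an IAA (ID, Authentication, Authorization) gate in
# front of shared data. Names and datasets are illustrative only.
import hashlib
import hmac
import secrets

class IAAGate:
    def __init__(self):
        self._users = {}    # user id -> (salt, password hash): identification store
        self._grants = {}   # user id -> set of dataset names: authorization store

    def register(self, user_id, password, datasets):
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        self._users[user_id] = (salt, digest)
        self._grants[user_id] = set(datasets)

    def authenticate(self, user_id, password):
        if user_id not in self._users:      # identification: unknown user fails
            return False
        salt, digest = self._users[user_id]
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, digest)  # authentication

    def authorize(self, user_id, dataset):
        # Authorization: only explicitly granted datasets are visible.
        return dataset in self._grants.get(user_id, set())

gate = IAAGate()
gate.register("analyst1", "s3cret", {"trades-eu"})
assert gate.authenticate("analyst1", "s3cret")
assert not gate.authenticate("analyst1", "wrong-password")
assert gate.authorize("analyst1", "trades-eu")
assert not gate.authorize("analyst1", "customers-us")
```

The point of the layered design is that applications never query the data store directly; every request passes the gate first, so the same protected data can serve many applications without each one re-implementing access checks.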
Secure Access and Use By Local Applications
Financial institutions moving to the public cloud have tackled the problem of security in storage and transmission through multi-level encryption. Data is stored in encrypted form, so local access at the cloud provider does not expose the actual data, only the bits and bytes that make it up. During transmission from the public cloud (the service provider) to the private network (the institution), another layer of encryption is added. When the data arrives at a processing machine inside the financial institution's secured network, both layers are decrypted and the data is used, then re-encrypted before being sent back into the cloud for storage.
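The two-layer flow above can be sketched as follows. This is a toy: the XOR keystream stands in for real ciphers (production systems would use something like AES for storage and TLS for transit), and the key names are invented for illustration. It should not be used for actual security, but it shows how the storage layer and the transit layer wrap and unwrap independently.

```python
# Toy illustration of two-layer encryption: one layer at rest, one in transit.
# The keystream cipher below is a stand-in for real ciphers (AES, TLS);
# it is NOT secure and exists only to show the wrapping order.
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Derive a pseudo-random keystream from the key and XOR it into the data.
    # Applying the same function twice with the same key recovers the input.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

storage_key = b"held-by-the-institution"  # provider never sees plaintext
transit_key = b"per-session-link-key"     # protects the provider-to-bank hop

record = b"acct=12345;balance=100.00"

at_rest = keystream_xor(storage_key, record)      # what the cloud stores
in_transit = keystream_xor(transit_key, at_rest)  # second layer for the wire

# At the institution, both layers are removed before processing.
received = keystream_xor(transit_key, in_transit)
plaintext = keystream_xor(storage_key, received)
assert plaintext == record
assert at_rest != record  # the provider only ever sees ciphertext
```

Note the ordering: the transit layer is peeled off first, then the storage layer, mirroring the article's description of decryption "on both levels" once data reaches the institution's secured network.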
When coupled with the usual careful access controls, this has proven to be a highly secure way to use cloud services while keeping security risks to a minimum.
Geographic Location of the Data
Often the most difficult requirement for financial institutions to meet when putting data into the cloud is government-imposed data residency. Multi-national financial services companies are often required by various jurisdictions to keep the personally identifiable financial information of customers or clients in specific locations on the map. Similarly, some regulatory requirements mean that data should not be stored in certain locations because of potential security issues that local laws may create. For example, data stored in the U.S. is required to be open to government access upon request, while the governments of Canada, France, and other countries forbid open access by foreign powers, which disallows the storage of certain personal information in the United States. Likewise, data stored in Luxembourg is under heavier restrictions than in most of the European Union and cannot be moved unless it is de-identified beforehand, making some transactions difficult.
Technically, the "cloud" is everywhere, but in the real world, most cloud services are regional rather than global. To gain the full benefit of cloud storage, the storage should be geographically widespread, but compliance issues will often forbid this. For this reason, most cloud service providers offer region-specific data storage guarantees that can be verified by third parties.
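One practical way institutions enforce residency rules is a policy check run before any upload: each data classification maps to the set of provider regions where it may legally live. The classifications and region names below are invented for illustration and are not legal guidance; real policies come from counsel and regulators.

```python
# Hypothetical residency policy: map data classifications to the provider
# regions where storage is permitted, then verify a placement before upload.
# All classification and region names are illustrative, not legal advice.
ALLOWED_REGIONS = {
    "pii-ca": {"ca-central"},          # e.g. Canadian personal data stays in Canada
    "pii-lu": {"eu-lux"},              # e.g. Luxembourg data cannot leave
    "deidentified": {"ca-central", "eu-lux", "eu-west", "us-east"},
}

def placement_allowed(classification: str, region: str) -> bool:
    # Default-deny: an unknown classification may be stored nowhere.
    return region in ALLOWED_REGIONS.get(classification, set())

assert placement_allowed("pii-ca", "ca-central")
assert not placement_allowed("pii-ca", "us-east")       # residency violation
assert placement_allowed("deidentified", "us-east")     # de-identified data travels
assert not placement_allowed("unknown-class", "eu-west")
```

A check like this pairs naturally with the third-party-verified region guarantees mentioned above: the provider attests where a region physically is, and the institution's tooling refuses placements outside the policy.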
About the Author
Michael Dorf is a seasoned software architect and instructor with an M.S. in Software Engineering and a dozen years of industry experience. He is a co-founder of LearnComputer LLC, an IT/Open Source training school based in the San Francisco Bay Area. Our Big Data Overview training course is designed for IT managers who need a fast track to the Big Data solutions available on the market today.