September 25, 2013
In an era where high-performance computing is still making the transition from science to industry, there is one industrial sector that is already an established HPC player, and that's oil and gas. Over at the Cray blog, Dr. Wulf Massell, Chief Geophysicist and Energy Segment Manager at Cray, observes that while 3D seismic data processing has been a staple of the oil and gas industry for years, recent developments in the field "have added additional layers of complexity and opportunity that promise to improve exploration and production operations." This new technology is called Permanent Reservoir Monitoring, or PRM, and it draws heavily from the most cutting-edge HPC and big data technologies.
Previously, the development strategy for a newly discovered field was based primarily on the original seismic data. Over the last two decades, however, producers found that repeating seismic surveys over the same field enabled them to detect and follow changes in reservoir reflectivity caused by the movement of injection and production fluids. These findings developed into the data-intensive approach known as Permanent Reservoir Monitoring.
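The core of the time-lapse idea is simple to state: subtract a baseline survey from a later "monitor" survey acquired on the same grid, and the residual amplitudes mark where fluids have moved. A minimal sketch, using synthetic data and an invented noise threshold purely for illustration:

```python
import numpy as np

# Hypothetical illustration: two seismic amplitude volumes over the same
# field, acquired years apart on an identical (inline, crossline, time) grid.
rng = np.random.default_rng(0)
baseline = rng.normal(size=(4, 4, 8))   # original survey
monitor = baseline.copy()
monitor[1:3, 1:3, 2:5] += 0.5           # reflectivity change from fluid movement

# The 4D (time-lapse) attribute is the difference volume; cells whose
# amplitude change exceeds a noise threshold flag where fluids have moved.
difference = monitor - baseline
changed = np.abs(difference) > 0.25
print(int(changed.sum()))               # 12 flagged cells in this toy example
```

In practice the two surveys are never identically acquired, so real PRM workflows spend most of their compute cross-equalizing the volumes before differencing, which is part of what makes the problem HPC-scale.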
Massell explains: "PRM is the integration of time-lapse seismic, or 4D-seismic surveys data, with the ongoing stream of production data and new well information that continuously improves the accuracy of the reservoir model. The end result is a situation in which production companies must deal with a major unstructured big data challenge that requires innovative high-performance computing models to get the job done."
PRM solutions allow the evolution of the reservoir to be tracked in unprecedented detail, providing a powerful tool to guide drilling and production decisions over the 25- to 50-year life of a new field. Massell discusses one PRM project that involved 14 seismic surveys collected over a ten-year period. "The insights gained from the continuous refinement of the reservoir model not only improved production during this period," notes Massell, "but also the original estimate of reserves in place doubled twice during the period, extending the life of the field by decades."
Supercomputer-maker Cray is helping oil and gas companies with this next generation of challenging seismic processing and reservoir simulation workloads. At the Society of Exploration Geophysicists International Exposition (SEG 2013) taking place in Houston, Texas, this week, Massell is exploring the suitability of PRM as a new use case for Cray's Hadoop and Urika appliances. Although Cray's Hadoop technology was originally developed to handle unstructured data types for Web solutions providers, it could be re-purposed to perform the data warehousing, processing, and querying needed for subsurface interpretation. And the Cray Urika appliance is a good candidate for performing graph analytics on PRM data, helping to discover previously undetected relationships.
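The graph-analytics angle can be pictured as linking PRM entities into a single connectivity graph and then asking which entities are related through shared structure. A minimal sketch, with all entity names invented for illustration:

```python
from collections import defaultdict, deque

# Hypothetical PRM entities (wells, surveys, reservoir zones) as graph
# nodes, with an edge wherever two entities are observed to interact.
edges = [
    ("well_A", "zone_1"), ("well_B", "zone_1"),   # two wells tap the same zone
    ("well_C", "zone_2"),
    ("survey_2004", "zone_1"), ("survey_2007", "zone_2"),
]

adjacency = defaultdict(set)
for a, b in edges:
    adjacency[a].add(b)
    adjacency[b].add(a)

def connected(start, goal):
    """Breadth-first search: are two entities linked through the graph?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            return True
        for nxt in adjacency[node] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

print(connected("well_A", "survey_2004"))  # True: both touch zone_1
print(connected("well_A", "well_C"))       # False: no shared zone
```

The point of an appliance like Urika is to run this kind of relationship query at the scale of billions of edges, where the "previously undetected relationships" Massell mentions are far from obvious.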
"Accessing and analyzing several years of PRM data is a tractable challenge. Carrying that task forward ten, twenty, and more years in a world where technology change is constant is a daunting challenge," remarks the SEG session briefing.