October 11, 2012
LAS VEGAS, Oct. 11 — SAS High-Performance Analytics Server now supports more analytics, including text mining and optimization. In addition, its predictive modeling capabilities now work with the Hadoop Distributed File System (HDFS), the popular open source big data infrastructure.
In-memory software accelerates insights and solves complex problems involving huge volumes of structured and unstructured data; simply put, SAS High-Performance Analytics Server helps turn big data into gold.
“SAS has ramped up R&D on our high-performance analytics family,” said Jim Davis, SAS Senior Vice President and Chief Marketing Officer. “Our customers say lightning-fast analytic insights with SAS High-Performance Analytics Server provide powerful competitive advantage. These latest enhancements, including additional support for Hadoop and HDFS, make it easier to achieve value from big data.”
The updates bolster SAS’ growing high-performance product set, which includes the in-memory big data visualization of SAS Visual Analytics and an expanding roster of industry-specific and horizontal analytic solutions. The latest of these, SAS High-Performance Marketing Optimization, was also announced.
Early users of SAS High-Performance Analytics Server report analytic computing time shrinking from days and hours to minutes, even seconds. They can analyze more data, perform more model iterations and run more sophisticated algorithms.
For example, by evaluating numerous scenarios simultaneously, a bank can spot opportunities, detect emerging issues and respond immediately to market conditions. A retailer can personalize offers on the spot, based on structured and unstructured data from sales and social media, boosting sales.
New capabilities for current users
Customers running SAS High-Performance Analytics Server on database appliances from Teradata or EMC Greenplum gain increased functionality, including support for text data. SAS® Enterprise Miner™ users can build predictive and descriptive models even faster on large data volumes.
Organizations use SAS analytics to detect fraud, minimize risk, increase response rates for marketing campaigns and curb customer attrition. SAS High-Performance Analytics means finding fraud before claims are paid, evaluating risk more frequently, improving marketing campaigns and reaching valuable customers before they defect.
Now enabled for SAS High-Performance Analytics Server, SAS Text Miner unlocks information from extremely large document collections, including social media. Leaders can act on new opportunities more quickly, more precisely and with less risk.
The updates also enhance large-scale optimization with select high-performance capabilities in existing SAS/OR® procedures. Only SAS High-Performance Analytics Server combines true mathematical optimization (as opposed to simple rules-based analysis and user-defined constraint modeling) with big data to make the best possible decisions.
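For a sense of the kind of problem SAS/OR addresses, a small linear program can be stated in the OPTMODEL procedure. The product-mix scenario below is a hypothetical illustration, not an example from the release:

```sas
/* Hypothetical product-mix example: maximize profit subject to a
   shared capacity constraint, using the SAS/OR OPTMODEL procedure. */
proc optmodel;
   var x >= 0, y >= 0;              /* units of two products */
   maximize profit = 3*x + 5*y;     /* per-unit profit: 3 and 5 */
   con capacity: x + 2*y <= 10;     /* shared machine hours */
   solve;
   print x y profit;
quit;
```

At production scale, the same model structure would run against far larger constraint sets, which is where the high-performance capabilities apply.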
The latest SAS High-Performance Analytics Server supports HDFS in deployments on industry-standard hardware platforms.
SAS/ACCESS® Interface to Hadoop, introduced in February, is among nearly 30 packaged solutions for data connectivity and integration between SAS and third-party data sources, including data warehouse appliances, enterprise applications, non-relational mainframe data sources and popular relational databases.
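As a sketch of how that connectivity works (the server name, credentials, and table below are hypothetical), SAS/ACCESS Interface to Hadoop lets SAS programs address Hive tables through a standard LIBNAME statement:

```sas
/* Hypothetical connection: assign a SAS libref to a Hive server
   through the SAS/ACCESS Interface to Hadoop engine. */
libname hdp hadoop server="hadoop01.example.com" port=10000
        user="sasuser" password="XXXXXXXX";

/* Hive tables can then be queried like ordinary SAS data sets. */
proc sql;
   select count(*) as n_rows
   from hdp.weblogs;        /* hypothetical Hive table */
quit;
```

The same LIBNAME pattern is what SAS programmers already use for other SAS/ACCESS engines, which is why existing skills carry over to Hadoop data.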
The new SAS/ACCESS Interface to Hadoop software is eagerly welcomed by companies rolling out Hadoop pilots or production deployments.
“Hadoop is the big data platform of choice for Macys.com’s analytics team, and SAS is our analytics engine that drives insights. We connect the two environments to create an analytics solution that drives business value,” said Kerem Tomak, Vice President of Marketing Analytics at Macys.com. “I’ve used SAS/ACCESS engines for other data sources in the past and, based on that, I believe SAS/ACCESS Interface to Hadoop will provide tangible benefits.”
Meanwhile, SAS graphical tools such as SAS Data Integration Server enable users to access, process and manage Hadoop data and processes from within the familiar SAS environment, alleviating problems of skills shortages and complexity associated with Hadoop. SAS augments Hadoop with world-class analytics, plus metadata, security and lineage capabilities, ensuring that Hadoop will be enterprise-ready.
SAS Data Management also mitigates the lack of mature tools for developing and managing Hadoop deployments, helping organizations gain value from Hadoop more quickly and with fewer resources, complemented by Hadoop integration with SAS Information Management.
“EMC Greenplum has long realized Hadoop’s value and has made it a key component in the Greenplum Unified Analytics Platform (UAP), which combines the co-processing of structured and unstructured data,” said Josh Klahr, Vice President of Products at Greenplum, a division of EMC. “The enhanced SAS High-Performance Analytics Server with Hadoop support running on Greenplum delivers a superior analytics environment, allowing customers to extract insights from vast amounts of data. Greenplum will continue working with SAS to create optimized big data platforms that deliver timely results and excellent customer value.”
Rick Lower, Partnership Marketing Director at Teradata, said, “In this time of increasingly complex business scenarios and the emergence of new opportunities from a universe of big data, SAS and Teradata analytic solutions are producing valuable insight for more companies than ever. Our new Teradata 700 appliance for SAS offers pre-installed SAS in-memory analytics with advanced capabilities.”