October 10, 2013
The US Postal Service (USPS) is upgrading its supercomputing and big data analytics capabilities once again to boost efficiency and fight fraud. A recent article at Federal Computer Week reports that Maryland-based FedCentric Technologies has been awarded a five-year, $16.7 million contract to super-size the agency's computing infrastructure.
The USPS already has an impressive 16-terabyte in-memory computing system, located in Eagan, Minn., that checks 528 million mail pieces per day against a database of 400 billion records. The big data system has been a huge success, so much so that FedCentric Technologies was selected to build four more similar supercomputing systems at the Eagan facility, further improving the quality and quantity of data coming through USPS networks.
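The article does not describe how the Eagan system is built internally, but the pattern it implies, checking each scanned piece against records held entirely in memory, can be sketched in a few lines. Everything below (the barcode format, the field names, the record schema) is a hypothetical illustration, not USPS's actual design.

```python
# Toy sketch of the in-memory keyed-lookup pattern the article describes.
# The barcodes, field names, and schema here are illustrative assumptions.
records = {
    "9400100000000000000001": {"origin": "55121", "expected_dest": "20260"},
    "9400100000000000000002": {"origin": "30301", "expected_dest": "94105"},
}

def check_piece(barcode: str, scanned_dest: str) -> str:
    """Check one scanned mail piece against the in-memory record store.

    A hash-table lookup is O(1) on average, which is what makes checking
    hundreds of millions of pieces per day feasible once the record set
    fits in RAM rather than on disk.
    """
    record = records.get(barcode)
    if record is None:
        return "unknown piece"      # no record on file for this barcode
    if record["expected_dest"] != scanned_dest:
        return "routing mismatch"   # piece has diverged from its route
    return "ok"

print(check_piece("9400100000000000000001", "20260"))  # ok
print(check_piece("9400100000000000000002", "20260"))  # routing mismatch
```

The point of the sketch is the design choice, not the code: keeping all 400 billion records resident in memory turns each per-piece check into a constant-time lookup instead of a disk seek.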
Gerry Kolosvary, president of FedCentric Technologies, told FCW that the USPS will be able to keep close tabs on every piece of mail. In fact, individual mail pieces will be scanned up to 11 times as they go through the USPS systems. The data allows the agency to provide better service.
"There is great intelligence information coursing through USPS systems," said Kolosvary. "It allows them to be more competitive moving forward. We know what is in the mail carrier's bag the next day, and that leads to all kinds of useful decisions, like how many trucks to send out, how many people are needed for a given day. That's a lot to begin to look at."
The main theme of this upgrade seems to be efficiency, and for the cash-strapped agency, even small efficiency improvements can help close revenue gaps. Although the USPS has an annual operating budget of close to $60 billion, it still operates in the red, but there are already signs that the high-tech strategy is beginning to pay off. After reporting a $16 billion loss last year, the agency narrowed the loss to $6 billion this year. A big chunk of the losses is due to fraud, or even to mistakes like missing postage; the big data applications identify these issues and report anomalies to the US Postal Inspection Service for further investigation.
The new big data dynamos will perform some 2 billion scans per day in real time, comparing package data (carrier and routing information, weight, and size) against existing records. The current in-memory system provides the USPS with new levels of "visibility, reporting, sortation, fraud detection and deterrence," according to Kolosvary, and the upgrade will boost these abilities even more.
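One concrete anomaly the article names is missing or insufficient postage, which the system flags for referral to the Postal Inspection Service. A minimal sketch of that kind of check might look like the following; the record fields, rate table, and thresholds are invented for illustration and do not reflect USPS's actual rates or implementation.

```python
# Illustrative sketch only: the article does not describe the real
# implementation. Fields, rates, and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class ScanEvent:
    piece_id: str
    postage_paid: float   # dollars collected for this piece (assumed field)
    weight_oz: float      # measured weight in ounces (assumed field)

# Hypothetical rate table: (max weight in oz, minimum postage due).
RATE_TABLE = [(1.0, 0.46), (2.0, 0.66), (3.0, 0.86)]

def postage_due(weight_oz: float) -> float:
    """Look up the minimum postage owed for a given measured weight."""
    for max_oz, rate in RATE_TABLE:
        if weight_oz <= max_oz:
            return rate
    return RATE_TABLE[-1][1]  # heaviest bracket for anything above the table

def flag_anomalies(events):
    """Return pieces whose paid postage falls short of the amount due,
    the kind of case the article says gets reported for investigation."""
    return [e.piece_id for e in events
            if e.postage_paid + 1e-9 < postage_due(e.weight_oz)]

events = [
    ScanEvent("A1", postage_paid=0.46, weight_oz=0.8),  # correctly paid
    ScanEvent("B2", postage_paid=0.46, weight_oz=1.5),  # underpaid 2 oz piece
]
print(flag_anomalies(events))  # ['B2']
```

In a production pipeline the interesting part is the scale, running a check like this 2 billion times a day against in-memory records, but the per-piece logic reduces to a comparison of scanned measurements against what the existing record says should be true.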