Bill Mannel, Vice President of Product Management at SGI, discusses the significant advances in decision support enabled by server platforms such as SGI® Altix® UV, which offer dramatically improved data-analysis performance on an open, industry-standard architecture.
Many firms today are drowning in data, looking for ways to extract actionable intelligence from the massive data flows in their sales, R&D, and operations systems. Until recently, computing platforms that could handle these datasets were hampered by limited performance and the high cost of proprietary hardware and software. New technology is breaking these barriers with the introduction of scalable systems based on standard processors and open-source operating systems. Running off-the-shelf software and decision-support applications, these systems dramatically accelerate business-intelligence workflows while keeping costs down.
Advanced Data Analysis Capability Drives Competitive Edge
The ultimate goal of corporate decision-support efforts is real-time decision intelligence for better reaction to dynamic business conditions. Organizations are installing new server technology to:
• Expand business analytics capabilities
• Make critical decisions quickly
• Accelerate time-to-results
Benefits include everything from new product discovery and design, where development processes can shrink from months to hours, to fraud detection, where real-time feedback enables operators to intercept transactions before damage is done. Entirely new go-to-market strategies are enabled by real-time mass customization. Tremendous ROI is possible using these new servers.
Figure 1. Datasets larger than the US Library of Congress* can now be accessed in less than one-millionth of a second on industry-standard servers.
The Historical IT Dilemma: Expensive and Proprietary vs. Cheap, Standard, and Underpowered
Years ago, a corporate mainframe or a proprietary enterprise “RISC” server was the only choice for large-scale business computing tasks. As the performance of industry-standard server processors (also known as ‘x86’) increased and new architectures appeared, many organizations moved away from those older systems for certain applications, but retained the so-called ‘big iron’ for their most data-intensive workloads.
Software developers explored methods for splitting traditional ‘big iron’ applications and distributing them across a number of smaller servers, in the hope of reducing hardware costs and vendor lock-in. Some vendors developed appliance solutions that combined special software with tuned server hardware configurations. Some of these efforts were successful, but they could not meet all the performance demands of large, ever-growing, and increasingly complex datasets. Simply put, they did not overcome the fundamental data-handling limits of the underlying distributed server architectures, in which access to data outside of memory takes thousands of times longer than an in-memory reference.
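The latency gap between an in-memory reference and an access that must go outside of memory can be illustrated with a rough, self-contained timing sketch. This is purely illustrative (Python is used here only for convenience); absolute numbers depend heavily on hardware, and because the operating system's page cache will likely serve the file reads, this sketch measures only the software-path overhead of going through the file system, while a true cold-cache disk read is far slower still.

```python
import os
import tempfile
import timeit

# Build a 64 MB buffer in memory and also write it to a temporary file.
data = os.urandom(64 * 1024 * 1024)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

def read_in_memory():
    # In-memory reference: index one byte directly out of the buffer.
    return data[10_000_000]

def read_from_disk():
    # Out-of-memory reference: open, seek, and read one byte from the file.
    with open(path, "rb") as fh:
        fh.seek(10_000_000)
        return fh.read(1)

mem_t = timeit.timeit(read_in_memory, number=1000)
disk_t = timeit.timeit(read_from_disk, number=1000)
print(f"in-memory access: {mem_t / 1000:.2e} s per access")
print(f"file-system path: {disk_t / 1000:.2e} s per access")

os.unlink(path)
```

Even with the file fully cached by the operating system, the file-system path is orders of magnitude slower than the direct memory reference, which is the gap the article's point about distributed architectures turns on.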
Figure 2. Industry standard servers now exceed the in-memory data handling capabilities of proprietary enterprise servers
Available Today: Ground-breaking Capability on an Industry Standard Platform
Today’s newest industry-standard microprocessors can deliver the combination of performance, data-handling capacity, and reliability that enterprise computing demands. Importantly, raw processor performance is not always enough: the architecture must also tightly and reliably integrate these processors in numbers large enough to handle the massive amount of information that has become critical to daily business operations.
Some server vendors today can deliver this critical capability: processing vast datasets quickly on a standard ‘x86’ system running off-the-shelf software. For example, SGI Altix UV combines up to 256 Intel® Xeon® 7500 series processors to act as a single large system with up to 16 terabytes (TB) of memory, for the fastest data access possible from a single server. Most notably, these industry-standard servers can now exceed the capability of proprietary enterprise servers: a piece of information from a dataset larger than that of the entire U.S. Library of Congress can be accessed in less than one-millionth of a second on such a system. This contrasts markedly with small servers or blades, which typically address only up to 256 gigabytes (GB), a fraction of what Altix UV provides.
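The capacity gap quoted above can be checked with simple arithmetic. The sketch below compares the two figures from the text (16 TB for Altix UV, 256 GB for a typical blade), using binary units as an assumption:

```python
# Memory capacities quoted in the text, in bytes (binary units assumed).
TB = 1024 ** 4
GB = 1024 ** 3

altix_uv_memory = 16 * TB        # single-system memory of SGI Altix UV
typical_blade_memory = 256 * GB  # typical small-server/blade ceiling

ratio = altix_uv_memory / typical_blade_memory
print(f"Altix UV addresses {ratio:.0f}x the memory of a typical blade")  # 64x
```

That is, a single Altix UV system addresses 64 times the memory of a typical 256 GB blade.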
These servers run operating systems including Linux® and Windows®, along with off-the-shelf software. Implementation costs drop because software is available from many competing vendors. In addition, when industry-standard hardware and software are used in combination, more applications become available and the opportunity for innovation grows.
Handling the massive datasets that corporations grapple with today no longer requires expensive, proprietary server platforms, even at the highest performance levels. By keeping costs down with industry-standard microprocessors and software, these systems allow organizations to make better, faster decisions in R&D, sales, security, and operations. Industry-validated systems are now available that offer truly advanced insight into business challenges for companies whose decision-analysis requirements outpace their existing infrastructure.
Find out more in the InterSect360 white paper “Shared Memory: Standards Scale Up with SGI Altix UV”.