Applications are driving the enterprise, whether it is a relatively simple application used by millions of customers or a complex, scalable database that drives an organization's back end. These applications, and the users that count on them, expect rapid response times. In a world that demands “instant gratification,” forcing a customer, prospect, or employee to wait for a response is the kiss of death.

–George Crump, lead analyst, IT consulting firm Storage Switzerland, LLC

For most data centers, Crump suggests, “the number one cause of these ‘waits’ is the data storage infrastructure, and improving storage performance is a top priority for many CIOs.”

Sound familiar?

It may be challenging for executives who live well outside the IT glass house to think in milliseconds, or to recognize how much speed — which translates directly to application performance — matters. It’s tough enough to wrap our heads around the fractional advantages that accrue to Olympians like Usain Bolt and Michael Phelps, much less grasp the arcane benefits of sub-millisecond flash storage to everyday business applications.

But matter it does. Here’s a look at why, how we got here, and what organizations need to know as they tune their IT infrastructures for maximum performance — and optimum profitability. (Hint: it’s not speed for its own sake.)

I’ve got a flash for you. If you own a small to midsize business, then, to paraphrase James Carville: it’s the storage performance, dummy.

For years, the IT establishment has been telling businesses that faster performance can be achieved through more memory and greater CPU horsepower. The problem with that is simply this: at some point, you have enough processing power and memory capacity, and you’re still not satisfied with how your applications perform.

Even within the storage business, the mantra has been “large enough” storage. Only the very savvy have been recommending faster storage for the office network. Now, with the advent of inexpensive SSD/flash disks for the office, everyone can get very fast storage within a network environment. How does this translate to the data center and the cloud? Not all that well, as it turns out. Even as you add more CPU and memory to the cloud, apps are still sluggish.

The reason is that most cloud providers have opted for larger-capacity disks that are somewhat slower but less expensive. High-speed flash storage has only recently become feasible in the cloud, and even SSD drives have their limits there. There is, after all, a reason they call it “off the shelf.” This “commodity storage” offers only incremental bumps, when the smart money should be on technologies that deliver palpable, strategic improvements in storage performance, and therefore in application speed.

The industry hasn’t done an adequate job of equipping business users with the vocabulary they need to understand storage speed. The key term is latency: as Storage Switzerland puts it, how long it takes for a single data request to be received and for the right data to be found and accessed on the storage media. The norm used to be a storage latency of 5 milliseconds (ms); now it’s 1 to 2 ms across the board. And that’s still not fast enough. Sub-millisecond performance is just about here, and users need to start asking their providers for that kind of capability. Before long, providers that aren’t hitting sub-ms latency won’t even be in the ballpark.
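To make those milliseconds concrete, here is a rough, back-of-the-envelope sketch. The latency figures are the ones cited above; the request count and labels are purely hypothetical, and the point is the arithmetic, not a benchmark of any particular product.

```python
# Back-of-the-envelope illustration (hypothetical numbers): how per-request
# storage latency caps the rate at which an application can complete
# dependent, one-at-a-time storage operations.

latencies_ms = {
    "legacy array (~5 ms)": 5.0,
    "typical current array (1 to 2 ms)": 1.5,
    "sub-millisecond flash (~0.5 ms)": 0.5,
}

DEPENDENT_REQUESTS = 10_000  # e.g., a transaction that issues its reads serially

for label, latency_ms in latencies_ms.items():
    serial_requests_per_sec = 1000.0 / latency_ms              # ops completed back to back per second
    total_wait_sec = DEPENDENT_REQUESTS * latency_ms / 1000.0  # time spent just waiting on storage
    print(f"{label}: ~{serial_requests_per_sec:,.0f} serial requests/s, "
          f"{total_wait_sec:.1f} s of storage wait for {DEPENDENT_REQUESTS:,} requests")
```

The exact figures don’t matter; what matters is that when requests depend on one another, every millisecond of latency is paid on every request in the chain, which is why the drop from 5 ms to sub-millisecond latency shows up as responsiveness the user can actually feel.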

This leap in storage speed might seem like one of those technology breakthroughs of interest mostly to the people who make or sell the hardware, but that would be a misreading of the trend.

One industry analyst firm has identified high-speed flash storage as one of the three biggest emerging trends in cloud computing, noting that flash offers greater efficiency than traditional HDD storage. “Currently, the cost of flash-based NAS storage is several times larger than that of HDD-based NAS storage,” the firm reports. “With the increased user access needs, flash-based NAS storage arrays offer 40 to 45 times better performance than hard disk input/output (I/O) performance.” New York-based 451 Research concurs: “With the inclusion of solid-state drives (SSDs) in arrays, performance is no longer a differentiator in its own right, but a scalability enabler that improves operational and financial efficiency by facilitating storage consolidation.”

The “need for speed,” then, is neither an extravagance nor a distraction; it’s an opportunity for organizations to take stock of what they have, what they’re using, and how their IT infrastructure — whether in-house or administered in the cloud, through a third party — is supporting the business.

“The problem is that the storage industry often misleads IT professionals as to where they should direct their attention when trying to eliminate wait time,” says StorageSwiss’s Crump, intimating that IT’s confusion isn’t doing business users any favors. So not only is it “the storage performance, dummy,” it’s also “the latency, dummy.”

When it comes to recognizing the importance and impact of sub-millisecond storage performance, then, there really is no time to wait.