IT Failing to Optimize as Big Data Grows, Productivity Loss Imminent

The tsunami of big data is challenging the already-overburdened IT sector and causing costly I/O performance issues. Instead of buying new hardware, organizations can solve the problem more cost-effectively with software.

While the growth of big data makes advances possible in dozens of fields, it also presents serious challenges to the already-overburdened IT sector.

It’s estimated that big data performance issues will increase IT spending by hundreds of millions of dollars over the next few years, creating problems for organizations that can’t keep pace.

Big Data Boom

Experts estimate that 40 zettabytes (43 trillion gigabytes) of data will exist by 2020, with roughly 2.5 quintillion bytes created every day.

To put it in perspective, that is 300 times the amount of data reportedly in circulation in 2005.
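
Those headline figures are easy to sanity-check. Below is a minimal Python sketch, assuming decimal (SI) units throughout; note that under strict SI arithmetic 40 zettabytes works out to about 40 trillion gigabytes, so the widely quoted 43 trillion presumably reflects the original source’s rounding.

```python
# Quick sanity check on the headline figures (decimal SI units assumed).
ZB = 10 ** 21            # 1 zettabyte in bytes
GB = 10 ** 9             # 1 gigabyte in bytes
EB = 10 ** 18            # 1 exabyte in bytes

total_2020 = 40 * ZB     # projected digital universe by 2020
daily_rate = 2.5 * 10 ** 18   # 2.5 quintillion bytes per day

print(f"40 ZB in gigabytes: {total_2020 / GB:.1e}")        # ~4.0e13, i.e. ~40 trillion GB
print(f"Daily creation rate: {daily_rate / EB:.1f} EB/day")  # 2.5 exabytes per day

# A 300x increase since 2005 implies a 2005 baseline of roughly:
print(f"Implied 2005 total: {total_2020 / 300 / EB:.0f} EB")  # ~133 exabytes
```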

With that increase, of course, will also come a massive shift in how businesses use that data, as we are now seeing many enterprises transition from “big data” to “fast data.” In other words, the focus shifts from having huge amounts of data available to being able to use that data rapidly, accurately and for a host of applications in daily operations.

A primary reason for this data tsunami is the data collection and analysis required by the Internet of Things (IoT), through which devices—refrigerators, automobiles, home security systems, health monitors, etc.—communicate with one another and with computer systems. According to experts in the data center services industry, the next decade will be an inflection point in digitization, in which the growth of cloud providers will mushroom to meet the demand created—in part—by the IoT.

Many healthcare institutions, which are heavily affected by both IoT and big data analytics, lack the infrastructure or cannot afford to add enough physical servers to keep up with demand. More digital tools are entering health IT ecosystems for patients and clinicians alike, and growing volumes of medical images and human genome data consume ever more storage.

In the business sector, one major across-the-board function, marketing, has gone from being a largely analog, promotional activity to a heavily digitized engine of business growth. In the process, marketing has come to rival, and may soon surpass, traditional IT as a center of technology spending. A recent Gartner study suggests that marketing leaders now devote technology spending equal to 3.24 percent of total enterprise revenue, just behind the 3.4 percent of revenue that CIOs currently control.

Social media companies, most of whose business models center on the sale and use of data analytics, are both major users of, and investors in, infrastructure designed to keep up with the burgeoning universe of data. Facebook, for example, has been estimated to spend upwards of $1.5 billion per month on hosting-related costs.

Impact on IT

An estimated 86 million U.S. workers perform jobs requiring the regular use of a computer, according to the Pew Research Center. For them, degraded system performance caused by slow database applications translates directly into lost time and money. A recent Condusiv Technologies survey of more than 1,400 IT professionals revealed that performance issues cost each of these workers an average of 15 minutes per day.

Let’s do the math: at the current U.S. median annual salary of $58,000 (roughly $27.88 an hour over a standard 2,080-hour work year), 15 minutes per day represents an annual time loss equivalent to $1,812 per worker, or almost $156 billion across all 86 million workers.
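
The arithmetic, as a short Python sketch (the 2,080-hour year and 260 workdays are standard full-time assumptions the article implies but does not state):

```python
# Reproducing the back-of-the-envelope productivity estimate.
# Assumed: 2,080 paid hours and 260 workdays per year (40 hours, 5 days a week).
median_salary = 58_000           # U.S. median annual salary (USD)
hourly_rate = median_salary / 2_080
daily_loss_hours = 15 / 60       # 15 minutes lost per day
workdays_per_year = 260

loss_per_worker = hourly_rate * daily_loss_hours * workdays_per_year
print(f"Annual loss per worker: ${loss_per_worker:,.0f}")      # ~$1,812

workers = 86_000_000             # U.S. workers who regularly use a computer
total = loss_per_worker * workers / 1e9
print(f"Economy-wide loss: ${total:,.0f} billion")              # ~$156 billion
```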

This productivity loss will only grow as data volumes do, and companies that fail to recoup the lost time will find themselves at a distinct disadvantage.

Solutions

Part of the solution may be buying new hardware, but there are quicker and more cost-effective steps IT system managers can take. New hardware can improve performance for a while, but systems inevitably bog down again, because a primary cause of I/O bottlenecks lies in the software data path between applications and storage, and that can’t be fixed with new hardware.

Instead, implementing software that reduces the number of input/output operations can improve performance dramatically, effectively doubling the I/O capability of existing storage and servers, including SQL servers, as the sketch below illustrates.
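
To make the idea concrete, here is a minimal, vendor-neutral Python sketch of one way software can cut I/O: coalescing many tiny writes into a few large buffered ones. The file name, record size, and buffer size are arbitrary illustrative choices; real I/O-reduction products apply this principle at the operating system, file system, or application level rather than in application code like this.

```python
import os
import time

RECORD = b"x" * 512     # one small record (512 bytes)
N = 50_000              # number of records to write (~25 MB total)

def small_unbuffered_writes(path):
    # Worst case: one tiny write() call per record, no user-space buffering,
    # so the OS sees 50,000 separate small I/O requests.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    try:
        for _ in range(N):
            os.write(fd, RECORD)
    finally:
        os.close(fd)

def coalesced_buffered_writes(path):
    # Same data, but coalesced in a 1 MiB user-space buffer, so the OS
    # sees a few dozen large sequential writes instead.
    with open(path, "wb", buffering=1024 * 1024) as f:
        for _ in range(N):
            f.write(RECORD)

for fn in (small_unbuffered_writes, coalesced_buffered_writes):
    start = time.perf_counter()
    fn("bench.tmp")
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")

os.remove("bench.tmp")
```

On most systems the coalesced version runs dramatically faster, which is the same principle, applied far deeper in the stack, behind commercial I/O-reduction software.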

Bottom line: Look at software first to solve your big data performance issues instead of buying new hardware. Solutions exist that can handle storage performance at the operating system, file system and application levels.