Enterprises across industry sectors are swamped with data but hungry for insights. Retailers are seeking intelligence from data on customers’ shopping habits, web browsing patterns, and interactions with service staff to deliver faster, more personal customer experiences. Healthcare organizations are mining patient cases, trial results, and research studies in pursuit of medical breakthroughs. Governments are studying e-service usage patterns to serve citizens more efficiently and cost-effectively.
These are but a few examples of how organizations are operating in the new world of big data. Indeed, the amount of data that corporations must store is doubling annually. Market research firm IDC estimates that digital data will grow to 2.75 zettabytes in 2012, with the volume reaching close to 8 zettabytes by 2015. (One zettabyte is equivalent to roughly 273 trillion songs on an iPod, assuming an average track of about 3.7 MB.)
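As a rough sanity check of that comparison (the ~3.7 MB average track size below is an assumption for illustration, not part of the IDC estimate):

```python
# Rough sanity check of the "273 trillion songs per zettabyte" comparison.
# The ~3.7 MB average track size is an assumption for illustration only.
ZETTABYTE = 10 ** 21              # bytes, in decimal units
AVG_SONG_BYTES = 3.66 * 10 ** 6   # ~3.7 MB per compressed track (assumed)

songs = ZETTABYTE / AVG_SONG_BYTES
print(f"~{songs / 10 ** 12:.0f} trillion songs")  # ~273 trillion
```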
The need for enterprises to extract intelligence from burgeoning data is evident. The challenge lies in how they approach data management and IT infrastructure.
Organizations’ ability to collect data has never been better. To cope with the volume generated, however, many enterprises have invested significant resources in moving and storing that data, and then invested again to organize and analyze the information in data marts.
Not surprisingly, this conventional two-step approach to data management is no longer adequate in today’s big data environment. Companies cannot indefinitely expand their traditional storage and pipelines to keep pace with data growth. More importantly, by the time the analysis of stock data (information already sitting in data marts) is ready, so much new data has arrived that the intelligence may no longer be relevant for responding to market changes.
In other words, the big data reality is pressing organizations to build a more agile data infrastructure that keeps them responsive and competitive with limited resources. To get there, organizations need to reexamine the three fundamentals, the ABCs of their data infrastructure: Analytics, Bandwidth and Content.
Analytics: Instead of focusing on stock data, what is the organization’s current ability to analyze data flow in real time or near real time? Is metadata sufficiently leveraged so that information can be indexed, sorted and prioritized by relevance, and retrieved on demand? (A simple sketch of this kind of metadata-driven indexing follows these three questions.)
Bandwidth: As video and audio data become increasingly abundant and important (consider the volume of YouTube videos, podcasts and video-surveillance footage), how well can the organization scale its capacity to disseminate and store those files, and to extract embedded information, such as sensor data, from them for analysis?
Content: Aside from the perennial capacity challenge, how is the organization ensuring adequate security for sensitive information? How are structured and unstructured data each managed appropriately? What workflows can be automated to mitigate human error and reduce staffing needs? What open-source and cloud-based solutions can be deployed to keep pace with the growing demand for, and cost of, storage?
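To make the analytics question concrete, here is a minimal sketch in Python of metadata-driven indexing. The record fields, tags and relevance scores are illustrative assumptions, not features of any particular product; the point is simply that tagging data at ingest lets it be retrieved on demand instead of rescanned.

```python
import heapq
import time
from collections import defaultdict

class MetadataIndex:
    """Toy in-memory index: tag records at ingest, retrieve on demand."""

    def __init__(self):
        self.records = {}                # record id -> full record
        self.by_tag = defaultdict(list)  # metadata tag -> record ids
        self.ranked = []                 # heap of (-relevance, record id)

    def ingest(self, record_id, payload, tags, relevance):
        # Capture metadata once, at ingest time, so later queries
        # never need to rescan the raw payloads.
        self.records[record_id] = {
            "id": record_id,
            "payload": payload,
            "tags": tags,
            "ingested_at": time.time(),
        }
        for tag in tags:
            self.by_tag[tag].append(record_id)
        heapq.heappush(self.ranked, (-relevance, record_id))

    def lookup(self, tag):
        """On-demand retrieval by metadata tag rather than a full scan."""
        return [self.records[rid] for rid in self.by_tag.get(tag, [])]

    def top(self, n):
        """The n most relevant records seen so far."""
        return [self.records[rid] for _, rid in heapq.nsmallest(n, self.ranked)]

index = MetadataIndex()
index.ingest("tx-1001", "customer purchase", ["retail", "web"], relevance=0.9)
index.ingest("cam-17", "surveillance frame", ["video"], relevance=0.4)
print([r["id"] for r in index.lookup("retail")])  # ['tx-1001']
print([r["id"] for r in index.top(1)])            # ['tx-1001']
```

Keeping a tag index and a relevance ranking alongside the raw records is what lets retrieval stay fast as volume grows; a production system would persist and distribute these structures rather than hold them in memory.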
Forward-thinking organizations are readily embracing agile data infrastructure to keep pace with this growth. To stay ahead of the competition, our customers tell us, their businesses need to be efficient; ultimately it comes down to profit and margin. They need to scale existing product and service lines as well as develop new opportunities. And they need to grow revenue sustainably, with greater productivity, stronger margins and higher profit.
IT often finds itself unable to deliver against these imperatives because of the complexity, cost and scale of decades-old approaches. For example, designing infrastructure one project at a time, customized to each new application, multiplies the cost and complexity of maintaining today’s systems while leaving the company exposed to the unknown requirements of tomorrow.
This makes the imperative to manage IT consistently at scale, and to use it to accelerate the business, all the more urgent.
With this magnitude of data growth, the question most IT groups are asking is, “Will my infrastructure support this type of growth?”
A business inflection point is an event that results in a significant change in the progress of a company, industry or sector: a turning point after which dramatic change, with either positive or negative results, is expected. Andy Grove, Intel’s co-founder, described an inflection point as “an event that changes the way we think and act.”
Continuous data growth can eventually lead to an inflection point for your business. If you are struggling to keep up and cannot provide the data services your company needs, you may already have passed it.
On the other hand, we believe that if you have an agile data infrastructure that is intelligent, immortal and infinite, you are in a position to help your company use data as a propellant to success.
As enterprises confront the new big data reality, those with an intelligent, scalable and agile data infrastructure are poised to uncover the insights they need to stay competitive and efficient.
Jeff Goldstein is general manager of Canada for NetApp.