There’s data. And then there’s BIG DATA.
Many of us have been bombarded with the term in any number of contexts, and some professionals chalk it up to marketing hype or a meaningless buzzword.
Personally, I prefer the way Gartner categorises it: as more than size. It is a multi-dimensional model that includes complexity, variety, velocity and, yes, volume.
But the pressing issue with this definition of Big Data is how best to secure something so vast and multifaceted. If you recognise that the old concept of a network perimeter is antiquated and dangerously narrow, you should be concerned about how to corral all this data and protect it both in transit and in storage.
The latter issue speaks directly to compliance needs. Banks and other financial institutions, medical facilities, insurers, retailers and government entities are especially sensitive to compliance requirements.
However, if your business doesn’t fit into these verticals, that doesn’t mean you can’t directly benefit from cloud-based security that creates the necessary context.
And though your organisation is dealing with an incredible mountain of data, you still must do what you can to protect not only the proprietary intelligence behind your firewalls, but also all the data moving in, around and through the various endpoints across the enterprise.
But again, size should not be the only consideration regarding Big Data. What matters is the means by which you analyse that data and apply the processes that allow you to make the best possible decisions about the ongoing security, accessibility and viability of all those many bits and bytes.
If you are looking at scale, the McKinsey Global Institute estimates that “enterprises globally stored more than seven exabytes of new data on disk drives in 2010.” One exabyte of data is the equivalent of more than 4,000 times the information stored in the US Library of Congress. That’s a lot of data.
Storing is one thing, but turning all that data into useful strategic and tactical outcomes depends on the other elements of Big Data: complexity, variety and velocity. To do this successfully, you need a means of putting all of it into context.
For instance, let’s say an account is accessed. The session presents the right username/password credentials and seeks to export some personal data, transfer funds or change sensitive account settings. On its face, you should allow this action: the user has the right name and authentication.
But when this is given greater context, there are dynamics from other silos of information that need to be factored in. What is the device profile? What is the URL reputation? Is the IP address consistent with past sessions? When was the last login attempt? What time did this latest transaction occur? What seemed to be a reasonable transaction might show a pattern of anomalous behaviour.
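To make the idea concrete, the questions above can be folded into a simple weighted risk score. This is a minimal, illustrative sketch only: the signal names, weights and threshold are hypothetical, and a real system would derive them from historical behaviour data rather than hard-code them.

```python
# Hypothetical contextual risk scoring for an authenticated transaction.
# All signals, weights and the threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class LoginContext:
    known_device: bool          # does the device profile match past sessions?
    url_reputation_ok: bool     # is the referring URL/domain reputable?
    ip_consistent: bool         # is the IP in the user's usual range or geography?
    hours_since_last_login: float
    local_hour_of_day: int      # 0-23, in the user's local time

def risk_score(ctx: LoginContext) -> float:
    """Sum weighted anomaly signals; a higher score means more suspicious."""
    score = 0.0
    if not ctx.known_device:
        score += 0.30
    if not ctx.url_reputation_ok:
        score += 0.25
    if not ctx.ip_consistent:
        score += 0.25
    if ctx.hours_since_last_login > 24 * 30:   # dormant account suddenly active
        score += 0.10
    if ctx.local_hour_of_day < 5:              # activity at an unusual hour
        score += 0.10
    return score

def decide(ctx: LoginContext, threshold: float = 0.5) -> str:
    """Valid credentials alone are not enough: step up when context looks wrong."""
    return "allow" if risk_score(ctx) < threshold else "step_up_authentication"
```

So a session with the correct password but an unrecognised device, an inconsistent IP and 3 a.m. activity would score 0.30 + 0.25 + 0.10 = 0.65 and be routed to step-up authentication, even though the credentials checked out.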
The bottom line is that Big Data can be managed given the right tools. Those tools exist in the cloud and can be managed through the same. And when you have the right rules passing through an integrated suite of security solutions, you’ll begin to see that size doesn’t matter.