Sixth ‘V’ of Big Data!

2015 proved to be the year of transformation in the data analytics market! Big Data, which used to be just hype, another buzzword, became the norm as more businesses realized that data, in all forms and sizes, is critical to making the best possible decisions. But while everyone else was still struggling to deal with the four V’s of Big Data, i.e. Velocity, Variety, Volume and Veracity, IBM added another ‘V’ and redefined Big Data as the ability to achieve greater ‘Value’ through insights from superior analytics!


With advances in technology, IoT devices have created an avalanche of data whose growth keeps accelerating. 90% of today’s data was created in just the last two years, and buying and selling this data has become the new bread and butter of business! We’ve mastered the art of extracting meaning from this data and turning it into useful insights, but what about the ‘Vulnerability’ that comes along with this “big exposure”? Since data providers and content aggregators are among the most influential actors in cyber security, the biggest question is: what measures or policies do we have in place to prevent theft or breach?

The big repositories that hold terabytes of this highly confidential data have become the main target of hackers and attackers, because the crown jewels of any organization, company data, customer data, employee data and all the trade secrets, reside right here. Data classification and data ownership will become critical as big-data breaches grow big too, and putting all this data in the cloud does not take away our responsibility to protect it, from a compliance and regulatory perspective as well as a commercial one.


By using key encryption techniques and applying authorized access controls, we can put some security measures in place before this big exposure catches up with us. It adds to IT overhead costs, but deploying a SIEM (Security Information and Event Management) system over this data seems to be the most popular solution for fraud detection and prevention across organizations.
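
As a rough illustration, here is a minimal Python sketch of field-level encryption applied before records land in a big-data store, using the cryptography library’s Fernet recipe. The field names and record schema are hypothetical, and a real deployment would keep the key in a key-management service rather than next to the data.

```python
# A minimal sketch of field-level encryption before records reach a
# big-data store, using the "cryptography" library's Fernet recipe.
# The field names ("customer_email", "trade_secret") are hypothetical.
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service (KMS),
# not alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

SENSITIVE_FIELDS = {"customer_email", "trade_secret"}  # assumed schema

def encrypt_record(record: dict) -> dict:
    """Encrypt only the sensitive fields, leaving the rest queryable."""
    return {
        field: fernet.encrypt(value.encode()).decode()
        if field in SENSITIVE_FIELDS else value
        for field, value in record.items()
    }

record = {"order_id": "1001",
          "customer_email": "alice@example.com",
          "trade_secret": "formula-x"}
protected = encrypt_record(record)
print(protected)  # sensitive fields are now opaque ciphertext tokens

# Only services authorized to hold the key can reverse it:
print(fernet.decrypt(protected["customer_email"].encode()).decode())
```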

Managing logs and audit trails should be a regular practice, rather than waiting for an incident to happen and only then taking action. Predictive analysis lets us detect these threats at an early stage. It is also very important to integrate information from physical security systems, such as CCTV and other building access controls, to avoid insider attacks and social engineering, which bring in new risks and challenges of their own. These risks include unrecognized backdoors and default credentials.
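
For instance, here is a minimal Python sketch of the kind of proactive log analysis described above: it flags accounts whose failed logins spike inside a short window, instead of reviewing logs only after an incident. The log format, window size and threshold are illustrative assumptions.

```python
# A minimal sketch of proactive audit-trail analysis: flag accounts whose
# failed logins spike inside a short window. Format and thresholds are
# illustrative assumptions, not recommendations.
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)   # assumed detection window
THRESHOLD = 3                   # assumed failures tolerated per window

def detect_bruteforce(events):
    """events: iterable of (timestamp, user, outcome) audit entries."""
    recent = {}  # user -> deque of recent failure timestamps
    for ts, user, outcome in events:
        if outcome != "FAIL":
            continue
        window = recent.setdefault(user, deque())
        window.append(ts)
        # Drop failures that fell out of the sliding window.
        while window and ts - window[0] > WINDOW:
            window.popleft()
        if len(window) >= THRESHOLD:
            yield ts, user, len(window)

log = [
    (datetime(2015, 12, 1, 9, 0), "bob", "FAIL"),
    (datetime(2015, 12, 1, 9, 1), "bob", "FAIL"),
    (datetime(2015, 12, 1, 9, 2), "bob", "FAIL"),
    (datetime(2015, 12, 1, 9, 3), "alice", "OK"),
]
for ts, user, count in detect_bruteforce(log):
    print(f"{ts}: {count} failed logins for '{user}' within {WINDOW}")
```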

It is crucial to detect patterns that uncover hidden relationships and remove these security threats. The threats can be internal or external, and can come either from network traffic or from anyone who accesses the applications, databases, mobile devices and networks inside the enterprise. Real-time streaming security analysis with deep packet inspection lets us monitor web traffic, network flows, Domain Name System (DNS) lookups, and port and protocol usage. This can reveal precisely which web servers have been infected by malware, pinpoint the data that has been leaked, identify suspicious domain names and deliver intelligence on data access patterns. It helps your organization figure out which data needs to be masked, which documents to redact and which data sources to monitor.
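
To make one of these checks concrete, here is a minimal Python sketch that scores DNS lookups by character entropy, since algorithmically generated (DGA) malware domains tend to look random. The threshold and sample domains are illustrative assumptions, not tuned values.

```python
# A minimal sketch of one streaming check from the paragraph above:
# flagging suspicious DNS lookups by character entropy, since
# machine-generated (DGA) domains tend to look random. The threshold
# and sample stream below are illustrative assumptions.
import math
from collections import Counter

ENTROPY_THRESHOLD = 3.5  # assumed cut-off; tune against real traffic

def shannon_entropy(s: str) -> float:
    """Shannon entropy in bits per character of the string."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_suspicious(dns_queries):
    """Yield (domain, entropy) for lookups that look machine-generated."""
    for domain in dns_queries:
        label = domain.split(".")[0]  # score the left-most label
        score = shannon_entropy(label)
        if score > ENTROPY_THRESHOLD:
            yield domain, round(score, 2)

stream = ["mail.example.com", "x7f3kq9zt2pw1vb8ncy4.biz", "intranet.local"]
for domain, score in flag_suspicious(stream):
    print(f"suspicious lookup: {domain} (entropy {score})")
```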

With malicious attacks becoming more complex and new malware constantly being developed, it is high time we made security a top priority wherever data is a core business function!
