Big data, heavy data & DCAP

With information volumes surging and compliance burdens growing, enterprises need a new approach to data security

 

By 2025, humans will have generated 180 zettabytes of data[i], or 180 trillion gigabytes. That’s a steep climb from the roughly 10 zettabytes we’d created back in 2015.

Managing and protecting all that information has become a significant challenge. Surging data volumes include important business information that is often spread across multiple repositories – different geographies, business units, departments, and formats – and housed both on-premises and in the cloud.

Draconian regulations

Regulatory regimes like the EU’s General Data Protection Regulation (GDPR) and the coming California Consumer Privacy Act (CCPA) pile on the pressure by compelling organisations to rigorously catalog all the personally identifiable information they have on file and monitor it continuously for potential breaches.

Because of this, companies have to submit to more data audits and assessments than ever before. Many struggle to provide the necessary proof that they have full visibility of critical data, with appropriate protections in place.

Organisations are also drawing on ever greater volumes of information from outside sources. Categorizing all third-party data and understanding its sensitivity before it enters company systems is critical.

With data so dispersed, repositories overflowing, and audits and regulatory scrutiny on the rise, traditional methods of managing data security are quickly coming unstuck.

Last year there were more than 6,500 publicly disclosed incidents of data loss, affecting more than 5 billion records. Despite large and ongoing investment in cybersecurity, data assets remain vulnerable. Many companies today simply couldn’t tell you where all their sensitive data is – particularly when it sits in unstructured formats or is housed in relational databases, data warehouse appliances, or big data sources located on-premises and in the cloud.

That lack of data-centric protection increases the risk of data breaches and compliance failures – both of which can extract a painful cost.

A better way to protect data

Organisations in every sector depend on the security, accuracy, and availability of their data to generate revenue. Data helps businesses better serve customers, boost productivity, understand the drivers behind business outcomes and plan for the future.

So data is now business critical. But if you don’t know where your prize assets are, you can’t protect them. With data breaches on the rise, every organization needs to look again at its approach to risk mitigation and verify that it has the tools and processes in place to deliver these core data protection capabilities:

  • Locating and classifying sensitive data, regardless of where it sits in the organization.
  • Applying protection mechanisms to valuable and personally identifiable data to mitigate breaches.
  • Sustaining compliance with current data security and privacy regulations, including the ability to monitor data and user behavior and to report adverse activity quickly.
  • Visualization capabilities that allow users across the business to conduct analytics.
  • Reporting capabilities that deliver robust audit readiness.
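The first capability above, locating and classifying sensitive data, can be illustrated with a minimal pattern-based scan. The sketch below is a simplification under stated assumptions: the regex patterns and label names are invented for illustration, and a real DCAP product would use far richer detection (dictionaries, validation checksums, fingerprinting, machine learning).

```python
import re

# Illustrative PII detectors only; production tools go far beyond regexes.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text: str) -> str:
    """Return a coarse sensitivity label for a piece of text."""
    hits = sorted(name for name, pat in PII_PATTERNS.items() if pat.search(text))
    return "restricted:" + ",".join(hits) if hits else "public"
```

A scanner built this way can walk every repository and tag each file with a label, which is the raw material for the monitoring and reporting capabilities that follow.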

The rise of DCAP

Gartner Research predicts[ii] that by 2020, 40 per cent of enterprises will have replaced the disparate and siloed data security tools currently in use with data-centric audit and protection (DCAP) products. DCAP tools deliver a centralized view of all at-risk data, allowing organizations to track their sensitive data and protect it in line with regulatory and risk management requirements.

DCAP is all about encouraging IT teams to focus on data, not the underlying information technology. While other data security approaches keep IT departments forever chasing potential threats, DCAP focuses on viewing, monitoring, and managing how users interact with high-risk data sets.

IT maintains responsibility for the installation, configuration, and management of business apps and network infrastructure, but leaves responsibility for data with the people who really understand its value: the owners – those who created it, their departments, and their managers.

The five pillars

Its five pillars enable companies to get to grips with their most sensitive information:

  • Classifying data across the IT estate, with policies that categorize files as they are created.
  • Controlling data and/or privileges – from access to editing to blocking – with unique profiles for specialist users such as system administrators and developers.
  • Reporting user activity to detect suspicious behavior.
  • Tracking and controlling security events as they occur, helping organizations understand where vulnerabilities may be hiding.
  • Centralizing data management through a dashboard that lets administrators apply security policies quickly across the network.
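The control pillar can be sketched as a toy policy table in which permitted actions depend on the data classification and the user’s profile. The classifications, roles, and actions below are invented for illustration and are not any vendor’s actual schema:

```python
# Toy access-policy table: (classification, role) -> permitted actions.
# All names here are hypothetical examples.
POLICY = {
    ("public", "employee"): {"read", "edit", "share"},
    ("internal", "employee"): {"read", "edit"},
    ("restricted", "employee"): {"read"},
    ("restricted", "sysadmin"): {"read", "backup"},  # admins manage, not edit
}

def allowed(classification: str, role: str, action: str) -> bool:
    """Check an action against the policy; unknown combinations are denied."""
    return action in POLICY.get((classification, role), set())
```

The default-deny lookup reflects the DCAP mindset: any access not explicitly granted for a given classification and profile is refused.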

While achieving all five can seem like a monumental challenge, the technology needed to execute the DCAP model is now well within reach – even for large enterprises. Companies are now taking advantage of the next generation of analytics tools to better understand where their data crown jewels reside, track who’s accessing them, and apply appropriate levels of protection.

With GTB’s built-in machine learning and analytics, organizations gain deeper data insight and greater confidence that they are in compliance. They can also control how files are shared on popular collaboration and enterprise content management platforms like SharePoint.

GTB’s Data Protection that Works™ platform gives companies the means to implement DCAP across the entire enterprise. By streamlining data protection processes with AI-driven algorithms, GTB delivers a high level of data security assurance without disrupting smooth business operations.

 

[i] “2016 IoT Midyear Review – The Report Card for Everyone”, IDC, August 4, 2016.

[ii] “Market Guide for Data-Centric Audit and Protection”, Gartner Research, March 21, 2017.


Visibility: Accurately discover sensitive data; detect and address broken business processes or insider threats, including attempted breaches of sensitive data.

Protection: Automate data protection, breach prevention, and incident response both on and off the network; for example, find and quarantine sensitive data exposed in files on user workstations, file shares, and cloud storage.

Notification: Alert users to violations, raising awareness of cybersecurity risks and corporate policies.

Education: Deliver targeted cyber-security training; for example, identify end users who violate policies and train them.
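The “find and quarantine” behavior described under Protection can be sketched in a few lines. This is a minimal illustration under stated assumptions: the SSN pattern, file extension, and directory layout are invented, and it is not GTB’s actual mechanism:

```python
import re
import shutil
from pathlib import Path

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative detector only

def quarantine_exposed(root: Path, quarantine: Path) -> list:
    """Move text files containing SSN-like strings into a quarantine folder."""
    quarantine.mkdir(parents=True, exist_ok=True)
    moved = []
    # snapshot the file list first so moved files are not re-scanned
    for path in sorted(root.rglob("*.txt")):
        if SSN.search(path.read_text(errors="ignore")):
            shutil.move(str(path), str(quarantine / path.name))
            moved.append(path.name)
    return moved
```

Run against a shared folder, this moves any flagged file out of the exposed location and returns the list of quarantined names for the incident-response log.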

  • Employees and organizations know and control what information leaves the organization, where it is sent, and where it is stored.
  • End users can classify the data they produce, giving them influence over how it is controlled and increasing both protection and end-user adoption.
  • Data can be controlled across the entire domain from one central management dashboard with universal policies.
  • Multiple levels of control, including warnings to end users about potentially non-compliant or risky activities, protect against malicious insiders and human error.
  • Full data discovery detects sensitive data wherever it is stored and applies strong classification, watermarking, and other controls.
  • Full technical controls govern who can copy what data to which devices, and what can be printed or watermarked.
  • Integration with GRC workflows.
  • Reduced risk of fines and non-compliance.
  • Protection of intellectual property and corporate assets.
  • Compliance with industry, regulatory, and corporate policy.
  • Enforceable boundaries controlling what types of sensitive information can flow where.
  • Control of data flows to third parties and between business units.
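The boundary-enforcement ideas in the last two bullets can be sketched as a simple egress check. The business units, destination domains, and function name below are invented for illustration, not any product’s real policy format:

```python
# Toy egress-boundary table: destinations each business unit may send
# sensitive data to. Units and domains are hypothetical examples.
EGRESS_ALLOWED = {
    "finance": {"bank-partner.example", "auditor.example"},
    "engineering": {"ci-provider.example"},
}

def may_send(unit: str, destination: str, sensitive: bool) -> bool:
    """Permit non-sensitive traffic; restrict sensitive data to allowlisted destinations."""
    if not sensitive:
        return True
    return destination in EGRESS_ALLOWED.get(unit, set())
```

The same lookup shape extends naturally to unit-to-unit flows: treat the receiving business unit as the destination and allowlist it per classification.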