A Gartner Analyst and “It’s Time to Redefine Data Loss Prevention”

Today, it seems to be in vogue to criticize DLP solutions as out of date, insufficient for modern business needs, and generally out of touch with industry realities.

One of the more notable sources to voice this opinion has been none other than industry leader Gartner.

In an analysis piece entitled “It’s Time to Redefine Data Loss Prevention” [1], Gartner goes after the most dominant trends in DLP. The article asserts that security and risk management leaders need to shift from current trends in data loss protection and “implement a holistic data security governance strategy.” This, it argues, is the only way for IT departments to ensure “data protection throughout the information life cycle.”

The Gartner write-up lays out a nuanced but ultimately damning case against contemporary DLP. Note that GTB Technologies customers were not part of the analysis, as the report appears to cover only “Gartner Market Leaders.”

The summary of their argument looks something like this:

Despite a market awash in DLP solution options, organizations are still struggling with communication between data owners and those responsible for administering DLP systems. A symptom of this disconnect is that managers are opting for programs that will automate the work of DLP. According to Gartner, this has resulted in “technology-driven — rather than business-driven — implementations.”

Another problem, says Gartner, is that many DLP users struggle to get past the initial phases of discovering and monitoring data flows after the platform is first deployed. The focus on these meticulous tasks means that organizations never realize the potential benefits of “deeper data analytics” or of “applying appropriate data protections.”

Lastly, the article points out that DLP as a technology is viewed by users, whether individuals or enterprises, as a “high-maintenance tool” requiring constant attention and a substantial regular investment of man-hours. This ultimately leads to “incomplete deployments” relative to an organization’s actual DLP needs. As a result of all of these phenomena, says Gartner, companies end up stuck with systems that require constant fine-tuning and struggle to calculate the ROI on their substantial investments in DLP platforms.

While all of the above points are fair criticisms of contemporary DLP, the approach the analysis offers to solve these problems is totally off the mark. Gartner suggests a total shift in data loss management, moving away from reliance on technology and instead “sharing responsibility” for DLP among the different constituents in an organization. To achieve better DLP, the industry does not need to run away from technology; it needs to adopt programs that address the very real problems Gartner has laid out.

GTB’s Smart DLP that Works™ is a platform designed to do just that.

Using patented artificial intelligence models, GTB’s data loss prevention programs take an AI-based approach to managing sensitive data. This allows the platform to learn and map the network, freeing IT from the tedious maintenance attached to other solutions. Thanks to the precision of its detection technology, its ease of use, and its quick time to value, Smart DLP streamlines processes instead of bogging down administrators with errors and false positives.
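Neither Gartner’s report nor this post describes any vendor’s detection internals, so the following is a minimal, purely illustrative sketch (not GTB’s actual algorithm) of one classic way a detection engine earns its precision: a naive regex flags any 16-digit number as a payment card, while an added Luhn checksum validation step discards most random digit strings, cutting false positives.

import re

# Candidate pattern: 16 digits, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){15}\d\b")

def luhn_valid(number: str) -> bool:
    """Return True when the digit string passes the Luhn checksum."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list:
    """Keep only regex hits that also pass checksum validation."""
    return [m.group() for m in CARD_RE.finditer(text) if luhn_valid(m.group())]

# The first number passes the checksum; the second does not.
print(find_card_numbers("order 4111 1111 1111 1111 ref 1234567890123456"))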

With Smart DLP, managers can have their cake and eat it too. GTB reminds users that security need not come at the expense of efficiency.

[1] Gartner, “It’s Time to Redefine Data Loss Prevention,” published 19 September 2017, ID G00333194.



Visibility: Accurately discover sensitive data; detect and address broken business processes or insider threats, including sensitive data breach attempts.

Protection: Automate data protection, breach prevention, and incident response both on and off the network; for example, find and quarantine sensitive data within files exposed on user workstations, file shares, and cloud storage (a minimal sketch of such a pass follows this list).

Notification: Alert users to violations to raise awareness and educate end users about cybersecurity and corporate policies.

Education: Start targeted cyber-security training; e.g., identify end users who violate policies and train them.
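To make the “Protection” item above concrete, here is a small, purely hypothetical quarantine pass over a file share. The paths, the single SSN-style pattern, and the .txt filter are assumptions made for this example; a real deployment would rely on the product’s detection engine and console, not a script like this.

import re
import shutil
from pathlib import Path

SCAN_ROOT = Path("/srv/fileshare")        # hypothetical share to scan
QUARANTINE = Path("/srv/dlp_quarantine")  # hypothetical quarantine area
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # US SSN-style pattern

def quarantine_exposed_files():
    """Move plain-text files containing SSN-like strings into quarantine."""
    QUARANTINE.mkdir(parents=True, exist_ok=True)
    for path in SCAN_ROOT.rglob("*.txt"):
        text = path.read_text(errors="ignore")
        if SSN_RE.search(text):
            print("quarantining", path)
            shutil.move(str(path), str(QUARANTINE / path.name))

if __name__ == "__main__":
    quarantine_exposed_files()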

  • Employees and organizations have knowledge and control of the information leaving the organization, where it is being sent, and where it is being preserved.
  • Ability to allow user classification, giving users influence over how the data they produce is controlled, which increases protection and end-user adoption.
  • Control your data across your entire domain in one Central Management Dashboard with Universal policies.
  • Many levels of control, together with the ability to warn end users of potentially non-compliant or risky activities, protecting against malicious insiders and human error.
  • Full data discovery collection detects sensitive data anywhere it is stored, and provides strong classification, watermarking, and other controls.
  • Delivers full technical controls over who can copy what data and to what devices, and over what can be printed and/or watermarked.
  • Integrate with GRC workflows.
  • Reduce the risk of fines and non-compliance.
  • Protect intellectual property and corporate assets.
  • Ensure compliance within industry, regulatory, and corporate policy.
  • Ability to enforce boundaries and control what types of sensitive information can flow where.
  • Control data flow to third parties and between business units (a minimal policy-evaluation sketch follows below).
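As a rough illustration of the last two points, the sketch below evaluates whether a given transfer is allowed under a tiny default-deny rule set. The Transfer fields, classifications, and rule table are invented for this example and are not GTB’s configuration schema.

from dataclasses import dataclass

@dataclass
class Transfer:
    classification: str  # e.g. "public", "internal", "confidential"
    source_unit: str     # originating business unit
    destination: str     # receiving business unit or external party

# Allowed destinations per classification; None means "allowed anywhere".
RULES = {
    "public": None,
    "internal": {"legal", "finance", "engineering", "sales"},
    "confidential": {"legal", "finance"},
}

def is_allowed(t: Transfer) -> bool:
    """Default-deny: unknown classifications are always blocked."""
    allowed = RULES.get(t.classification, set())
    return allowed is None or t.destination in allowed

print(is_allowed(Transfer("confidential", "finance", "legal")))        # True
print(is_allowed(Transfer("confidential", "finance", "partner.com")))  # False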