Gartner’s updated Market Guide for Data Loss Prevention 2023 came out last week, and it makes clear that data protection programs remain essential for security teams — but Gartner also sees a need for significant evolution. Let’s dive into Gartner’s key points and see how Code42 is staying a step ahead in data protection.
Understanding the evolving landscape of DLP
As I’ve developed the Code42 Incydr product strategy, I’ve seen the urgency of organizations’ need for data protection guidance in their questions about Data Loss Prevention (DLP). Gartner reports that inquiry volume on this subject remains consistently high, underscoring how significant — and persistent — the data loss problem remains despite continued investment.
The initial wave of DLP was primarily about defining parameters, setting up strict boundaries and monitoring transgressions. It was an era dominated by ‘allow and deny’ policies, heavy content inspection and classification. It was also an era of complexity, endpoint performance issues and false positives. But as workplaces grew more decentralized and cloud-based, the complexity of data protection increased.
Gartner makes it clear that traditional, content-heavy, classification-based DLP solutions, while still prevalent, are not catering to the dynamic data security requirements of modern organizations. So, with a persistent need for data protection that covers both the endpoint and cloud, a wide array of options developed. Now, DLP isn’t just a standalone product. It’s a capability integrated into platforms from email security to endpoint protection and security service edge platforms. DLP also isn’t complete without an underlying understanding of risk and user behavior. Gartner asserts that, “by 2027, 70% of CISOs in larger enterprises will adopt a consolidated approach to address both insider risk and data exfiltration use cases.” The need to account for employee behavior and changing patterns of work drove this evolution in DLP. Files containing sensitive information no longer live in one clear location or reliably match certain patterns; the valuable data that drives businesses exists at the perimeter and is more portable than ever.
So what’s next? To me, it sounds like DLP may well become an expected component of Insider Risk Management (IRM). But exactly how Data Loss Prevention tools address insider risk use cases still differs among vendors.
Content inspection and classification: More harm than good?
Gartner notes that traditional DLP tools face several challenges. They rely heavily on content inspection, which can be resource-intensive and cause performance issues. They also generate many false positives, leaving the data protection problem unresolved. Finding and classifying valuable, sensitive data is a major hurdle. It often limits a project’s success and consumes a lot of time. It’s like being trapped in quicksand — the harder you try to pinpoint each piece of critical data, the deeper you sink into operational challenges. This results in teams caught in an unenviable cycle: either invest excessive effort, often leading to diminishing returns, or acquiesce and accept the looming risk.
False positives are a glaring side effect: an oversensitive system flags legitimate actions, causing business disruptions. More dangerously, unidentified assets — those slipping through classification cracks — are harder to measure. These blind spots leave organizations exposed to unexpected data loss, the kind that makes headlines. Data classification might evolve into a niche specialty, with third-party experts or tools easing the pressure on DLP systems’ built-in content inspections. Content inspection alone is not enough to accurately detect risk. It must be balanced with user-centric risk monitoring to detect both malicious and accidental data loss.
Meeting evolving needs: Robust risk detection and response without burdening teams
To meet Gartner’s requirements for the Data Loss Prevention market, we need more robust detection and prioritization methods, along with a more effective way to control surfaced risks. Critically, this has to be done without overloading already resource-constrained security teams and without limiting end-user productivity. Code42 takes a unique approach to this challenge. We define the known risks to monitor and also surface unknown risks from day one through an AI-driven prioritization model that requires no policy setup.
Instead of relying on heavy content inspection, Code42’s Incydr product uses context for enhanced data protection by understanding where files originated, whether the destinations to which they are moving are trusted, who’s moving them and when. A one-size-fits-all response to data threats doesn’t work. If the majority of uncovered risks are correctable mistakes rather than malicious acts, they warrant a different response than data theft. Incydr supports effective prevention and remediation with a suite of controls that enable appropriate responses to both everyday mistakes and unacceptable activity. In addition to blocking, the product leverages pop-up education and alert-triggered video lessons, controls to revoke cloud sharing, and no-code automations that isolate endpoints, support conditional access, stop local sync apps, disable USB ports or lock a device. Incydr corrects user mistakes, blocks unacceptable activity, and investigates and contains active insider threats.
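To make the context-based idea concrete, here is a minimal, purely illustrative sketch of how file movement context (origin, destination trust, user status, timing) could feed a risk score instead of content inspection. All names, weights and categories here are hypothetical assumptions for illustration — this is not Incydr’s actual model or API.

```python
from dataclasses import dataclass

# Hypothetical trust lists -- illustrative assumptions, not a real product config.
TRUSTED_DESTINATIONS = {"corp-sharepoint", "corp-google-drive"}
CORPORATE_ORIGINS = {"corp-sharepoint", "corp-google-drive", "corp-laptop"}

@dataclass
class FileEvent:
    origin: str           # where the file came from
    destination: str      # where the file is being moved
    user_departing: bool  # e.g. the user has given notice
    hour: int             # local hour of the event, 0-23

def risk_score(event: FileEvent) -> int:
    """Toy context-based score: higher means riskier. Illustrative only."""
    score = 0
    if event.origin in CORPORATE_ORIGINS:
        score += 2  # a corporate file leaving its usual home
    if event.destination not in TRUSTED_DESTINATIONS:
        score += 3  # untrusted destination (personal cloud, removable media)
    if event.user_departing:
        score += 2  # departing employees carry elevated insider risk
    if event.hour < 6 or event.hour > 22:
        score += 1  # off-hours activity is a weak additional signal
    return score

# A corporate file copied to a personal cloud drive at 2 a.m. by a departing user
event = FileEvent("corp-sharepoint", "personal-dropbox", True, 2)
print(risk_score(event))  # -> 8
```

The point of the sketch is the shape of the signal: no file contents are read, yet the combination of untrusted destination, departing user and off-hours timing pushes the event to the top of the queue, while a routine upload to a sanctioned corporate drive scores zero.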
Data Loss Prevention tools evolve quickly. We are staying a step ahead with risk detection that doesn’t require classification and heavy content inspection. And we’re making solutions that actually reduce security burden with a spectrum of automated responses. If you’d like to see more of Gartner’s guidance and how Code42 stacks up as a representative vendor, you can access the full 2023 Gartner Market Guide for Data Loss Prevention here.