Using Data Protection Guidelines to Balance Security and Compliance

When did compliance become an enemy of security? Everyone knows that although security can lead to compliance, the reverse isn’t always true. But are security and compliance really impossible to balance?

If you take a purely economic view, compliance usually takes precedence over security for a simple reason: it is easy to measure as a binary check box exercise. As a result, huge budgets are invested in compliance efforts that could otherwise go toward improving security.

Since security is an attrition game that requires businesses to weigh effort and cost against the risk being mitigated, spend greatly affects the level of security that can be achieved. That's even before we consider that compliance typically creates a false sense of security: compliance projects are often inefficiently implemented, and everybody is generally happy as long as the organization passes its audits.

What Has Changed in 2018?

Has this changed in 2018? Have we finally prioritized security over compliance?

Unfortunately, the answer is almost always no. However, new technologies are changing the economics of this problem and increasing the opportunity to achieve the same (or better) levels of compliance while spending less money. This enables businesses to free up budget to elevate security rather than simply checking boxes.

To paint a clearer picture, let’s take a look at the numbers from Gartner’s “Market Guide for Data-Centric Audit and Protection,” which is emerging as a key framework for improving data security and compliance.

Retention and Storage

Retention requirements are constantly growing. It used to be acceptable to retain data for 30 to 60 days. Now, it’s far more common for retention requirements to span at least 13 months, and many companies are in the three-year category since new regulatory requirements demand extended retention.

While these changes present considerable challenges for database administrators and other information security professionals, solutions are emerging. Technologies such as security-centric big data lakes enable customers to increase their retention period by a factor of 10 while reducing storage costs by at least a factor of 10 — together, a 100-fold improvement in cost per unit of data retained. Features such as deduplication, compression and columnar storage account for many of these benefits; the rest comes from leveraging cloud and/or on-premises object stores, where the price point for bulk data storage is extremely low.

For example, 1 PB of audit data that is reduced to 200 TB of compressed/deduplicated/indexed data spread over hot/warm/cold cloud storage can cost as little as $25,000 per year. Compare that with the cost of 200 TB of enterprise storage area network (SAN) storage that is typically used to achieve online access, and it’s possible to save in excess of $1 million.
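The arithmetic above can be sketched in a few lines. The per-terabyte prices below are assumptions chosen only to match the article's round numbers — they are illustrative, not vendor quotes:

```python
# Illustrative retention-cost arithmetic for long-term audit data.
# Assumed prices (not quotes): blended hot/warm/cold cloud object
# storage at ~$125/TB-year versus enterprise SAN at ~$5,200/TB-year.

RAW_TB = 1000                 # 1 PB of raw audit data
REDUCTION = 5                 # dedup/compression/columnar: 1 PB -> 200 TB
CLOUD_PER_TB_YEAR = 125       # tiered cloud object storage (assumed)
SAN_PER_TB_YEAR = 5200        # enterprise SAN storage (assumed)

stored_tb = RAW_TB // REDUCTION             # 200 TB actually stored
cloud_cost = stored_tb * CLOUD_PER_TB_YEAR  # $25,000 per year
san_cost = stored_tb * SAN_PER_TB_YEAR      # $1,040,000 per year

print(f"Stored:               {stored_tb} TB")
print(f"Cloud tiered storage: ${cloud_cost:,}/yr")
print(f"Enterprise SAN:       ${san_cost:,}/yr")
print(f"Annual savings:       ${san_cost - cloud_cost:,}")
```

Under these assumed prices, the savings come out in excess of $1 million per year, consistent with the figures in the text.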

It’s important to note that these new technologies allow for data to be retained in a way that serves security and user activity analytics. It’s not dumped into some archive and never heard from again. So while the main driver is the need to be compliant, the implementation method makes it possible to improve security analytics while helping to significantly reduce costs.

User Activity and Long-Term Analytics

We know that user activity analytics is the only way to address the noise and overhead of massive raw data sets, but to be effective, the analytics must be applied to a long-term data repository. It is widely known that “stupid” algorithms that operate on a lot of data perform much better than “smart” algorithms on a few weeks’ worth of data. The optimal solution is to couple the smartest algorithms with the largest possible historical data sets.

Introducing data-centric audit and protection (DCAP) user activity analytics not only lowers costs and improves operational efficiency; cost-effectively retaining the data for longer periods of time also results in better security analytics and allows new insights to be uncovered.

Automation and Orchestration

The DCAP world is full of review and approval processes that, until now, have been mostly manual. Findings get routed manually to business owners, who manually approve events that are then manually added to policies and reference sets.

With DCAP, people often make decisions based on a very small set of criteria, and routing and orchestration can be automated. The decision process can also be automated 95 percent of the time by building a machine learning model to ingest these criteria. This significantly reduces the operational costs and complexity of review processes, since now people only need to be involved in 5 percent of the decision process.
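A minimal sketch of that routing idea, assuming hypothetical criteria and thresholds (the scoring rule below is a stand-in for a trained model, not any specific DCAP product's logic): high-confidence findings are decided automatically, and only the ambiguous remainder is routed to a human reviewer.

```python
# Sketch of automated review routing: a model scores each finding on a
# small set of criteria; confident decisions are handled automatically
# and only ambiguous cases go to a person. Criteria, weights and
# thresholds here are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Finding:
    user_is_dba: bool
    off_hours: bool
    touched_sensitive_table: bool

def approval_score(f: Finding) -> float:
    """Stand-in for a trained model: returns P(activity is legitimate)."""
    score = 0.9
    if f.off_hours:
        score -= 0.3
    if f.touched_sensitive_table and not f.user_is_dba:
        score -= 0.5
    return max(0.0, min(1.0, score))

def route(f: Finding) -> str:
    p = approval_score(f)
    if p >= 0.8:
        return "auto-approve"
    if p <= 0.2:
        return "auto-reject"
    return "human-review"   # the small remainder that still needs a person

# A DBA touching a sensitive table during business hours is auto-approved.
print(route(Finding(user_is_dba=True, off_hours=False, touched_sensitive_table=True)))
```

Only findings landing between the two thresholds reach a reviewer; tightening or loosening those thresholds is what tunes the human workload toward the 5 percent figure cited above.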

A Fresh Perspective on Security and Compliance

By adhering to the DCAP guidelines, organizations can reduce the cost of compliance while contributing to significant improvements in security. That way, compliance efforts need not get in the way of security, and security teams can focus more on improving the enterprise’s data protection processes and less on checking boxes to pass audits. If organizations around the world adopt this fresh perspective in 2018, compliance won’t be the enemy of security after all.

Watch the on-demand Webinar: “Enriched Agility, Retention and Insights With Guardium Big Data Intelligence”

Ron Bennatan

Co-Founder at jSonar Inc.

Ron Bennatan is a co-founder at jSonar Inc. He has been a “data security guy” for 25 years and has worked at companies such as J.P. Morgan, Merrill Lynch, Intel, IBM and AT&T Bell Labs. He was co-founder and CTO at Guardium, which was acquired by IBM, where he later served as a Distinguished Engineer and the CTO for Data Security and Governance. He has a Ph.D. in Computer Science and has authored 11 technical books.