Who is responsible for determining who can access sensitive information? Is it the role of the database or system administrator, or the data owners from lines of business (LOBs)? Perhaps oversight of permissions varies when the data includes sensitive information. Should your privileged users and admins have actual access to the content? If so, how much control do you have over preventing bad behavior?

Fighting Alert Fatigue

Organizations typically rely on volumes of logs to forensically identify who accessed what data at what time and assess whether the access was appropriate or constituted a policy violation. Administrators may consider flowing database or data access logs to the organization's security information and event management (SIEM) solution to correlate events and help identify policy violations. The problem is that the large volumes of logs collected and evaluated by the SIEM cause significant overhead and performance degradation, and reviewing them requires extensive human oversight. Analysts tasked with quickly reviewing these massive logs tend to become desensitized, since many alerts turn out to be false positives or otherwise irrelevant. Unfortunately, this means real risks are often overlooked.

An effective approach to this challenge is to front-end the information landscape, including databases, mainframe data and files, and move the analysis overhead away from the critical systems. A database, for example, is considered structured data, since its contents are stored in structured tables, columns and rows. When calls to the data are evaluated away from the critical systems themselves, there is an opportunity for real-time evaluation based on appropriate permissions to block, redact and mask content before disseminating it.
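To make the block/redact/mask idea concrete, here is a minimal sketch of the kind of per-role policy a front-end proxy might apply to a result row before it leaves the monitored system. The role names, column names and policy structure are all illustrative assumptions, not taken from any specific product.

```python
# Hypothetical policy: how each role's view of sensitive columns is
# transformed. These roles and columns are illustrative only.
MASKING_POLICY = {
    "analyst": {"ssn": "mask", "email": "redact"},
    "dba": {"ssn": "mask", "email": "mask"},
    "privacy_officer": {},  # full access, no transformation
}

def mask_value(value: str) -> str:
    """Keep only the last four characters visible."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def apply_policy(role: str, row: dict) -> dict:
    """Transform a result row according to the caller's role before
    the front-end returns it."""
    rules = MASKING_POLICY.get(role, {})
    out = {}
    for column, value in row.items():
        action = rules.get(column)
        if action == "mask":
            out[column] = mask_value(str(value))
        elif action == "redact":
            out[column] = "[REDACTED]"
        else:
            out[column] = value
    return out

row = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}
print(apply_policy("analyst", row))
# → {'name': 'Ada', 'ssn': '*******6789', 'email': '[REDACTED]'}
```

Because the transformation happens in the front-end rather than in the database engine, the critical system bears none of the evaluation cost.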

It’s also possible to leverage out-of-the-box governance frameworks. Data privacy requires knowing who is accessing data, when, whether the access is appropriate and whether sensitive information was touched. Many governance controls also track the number of failed logins and whether those attempts are eventually successful.
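The failed-login control mentioned above can be sketched in a few lines: flag any user whose run of consecutive failures reaches a threshold and then ends in a successful login, a classic sign of credential guessing that paid off. The event format and threshold here are assumptions for illustration; in practice the events would come from database audit logs or the SIEM.

```python
from collections import defaultdict

# Illustrative log events: (timestamp, user, outcome).
events = [
    (1, "jsmith", "failure"),
    (2, "jsmith", "failure"),
    (3, "jsmith", "failure"),
    (4, "jsmith", "success"),
    (5, "adoe", "success"),
]

def flag_suspicious_logins(events, threshold=3):
    """Return users whose streak of consecutive failed logins reached
    the threshold and was then followed by a success."""
    streak = defaultdict(int)
    flagged = set()
    for _, user, outcome in sorted(events):
        if outcome == "failure":
            streak[user] += 1
        else:  # success
            if streak[user] >= threshold:
                flagged.add(user)
            streak[user] = 0
    return flagged

print(flag_suspicious_logins(events))  # → {'jsmith'}
```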

Controlling Access to Sensitive Information in Real Time

By conducting this monitoring seamlessly outside of the actual database server or system, security teams can eliminate the overhead and let the databases, data repositories and SIEM tools do what they do best. In fact, these systems can continuously scan and monitor the entire IT landscape and categorize information according to policies. These methods filter outgoing data according to controls and may even terminate connections that attempt to violate policies.

Best of all, this is relatively easy to incorporate, given the right tools. Solutions that include comprehensive out-of-the-box governance models are already equipped to look in the right areas, and groups of users with varying levels of access permissions can be imported from the databases’ own groups or from external files and data structures. These groups can then be quickly aligned with the controlled data classifications and granted appropriate access and permissions. As for unstructured data on these servers, advanced data security solutions can perform the same monitoring and provide real-time controls to protect sensitive information.
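The alignment of imported groups with data classifications can be pictured as a simple clearance lookup: a user may read data of a given classification if any of their groups is cleared for it. The group names and classification labels below are hypothetical, chosen only to illustrate the mapping.

```python
# Hypothetical mapping of imported database groups to the data
# classifications each group may read; illustrative only.
GROUP_CLEARANCE = {
    "hr_admins": {"public", "internal", "confidential"},
    "marketing": {"public", "internal"},
    "contractors": {"public"},
}

def can_access(groups: list, classification: str) -> bool:
    """Grant access if any of the user's groups is cleared for the
    requested data classification."""
    return any(
        classification in GROUP_CLEARANCE.get(g, set()) for g in groups
    )

print(can_access(["marketing"], "internal"))       # → True
print(can_access(["contractors"], "confidential")) # → False
```

A deny-by-default lookup like this (unknown groups resolve to an empty set) is the conservative choice when group lists are imported from external sources.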

The bottom line is that it’s crucial to understand where the organization stores its data, who is accessing it and whether that access aligns with established security policies. Without this visibility, threats are bound to slip past the weary eyes of overworked security analysts, and sensitive data is bound to slip into the wrong hands.

