November 2, 2015 By Vikalp Paliwal 3 min read

Expectation: I am an IT security and compliance director, and I have an audit coming in a month. I have a few databases that hold customer data, product data and employee data. I think I will be OK and should be able to pass the audit if everything goes well.

Reality: I am an IT security and compliance director and I have an audit coming in a month. I have databases that I don’t even know about in my environment, and I am not sure if those databases contain customer data, product data, employee data or some other sensitive data. I don’t think I’ll be OK, and I will fail the audit. I am sure nothing will go well.

Is Your Compliance Status at Risk?

Does this reality sound all too familiar? And more importantly, are you covered?

Compliance is most challenging when you have to meet multiple security baselines and mandates (e.g., DISA STIGs, CVE advisories, CIS benchmarks, SOX, PCI DSS or HIPAA) for all your data sources. Multiply that by the number of data sources across several platforms, and the equation gets very complex.

Read the white paper: Three Guiding Principles to Improve Data Security and Compliance

Organizations are constantly trying to minimize the cost of compliance and are always looking for a way to meet all their compliance needs cost-effectively and without much hassle. Sounds too good to be true, right?

But it is possible. It starts with approaching compliance from a data source vulnerability perspective and adopting industry best practices along the way.

Why Do You Need This Approach to Compliance?

Organizations have thousands of data sources across multiple platforms (e.g., databases, data warehouses, big data platforms) with thousands of users accessing data through applications or natively via SQL queries. Database administrators (DBAs) often deploy production systems with default usernames and passwords, or with passwords that do not comply with the corporate baseline. As a result, those credentials are easy to crack, handing attackers access to the data.
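
To make that concrete, a compliance team could probe for leftover default logins with a short script. The sketch below is illustrative only: it assumes a PostgreSQL target and the psycopg2 driver, and the host name and credential list are hypothetical examples rather than a complete baseline.

```python
# Minimal sketch, assuming a PostgreSQL target and the psycopg2 driver.
# The host name and credential list below are illustrative only.
import psycopg2

DEFAULT_CREDENTIALS = [
    ("postgres", "postgres"),
    ("admin", "admin"),
    ("dba", "changeme"),
]

def find_default_logins(host, dbname="postgres"):
    """Return the default username/password pairs that still work."""
    hits = []
    for user, password in DEFAULT_CREDENTIALS:
        try:
            conn = psycopg2.connect(host=host, dbname=dbname, user=user,
                                    password=password, connect_timeout=3)
            conn.close()
            hits.append((user, password))  # login succeeded: flag it
        except psycopg2.OperationalError:
            pass  # credentials rejected, which is the desired outcome
    return hits

if __name__ == "__main__":
    for user, password in find_default_logins("db.example.internal"):
        print(f"WARNING: default login still active: {user}/{password}")
```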

Data control language (DCL) commands such as GRANT and REVOKE are not applied properly and often leave excessive privileges with users who do not need them. There are misconfigurations and default database settings that need to be changed, as well as missing patches that need to be applied.
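
Here is a minimal sketch of what one such entitlement check might look like. It assumes a PostgreSQL target, where the standard information_schema.role_table_grants view records table-level grants; treating any grant to PUBLIC as excessive is just an illustrative policy, not a universal rule.

```python
# Minimal sketch, assuming PostgreSQL. information_schema.role_table_grants
# is a standard catalog view; flagging every grant to PUBLIC is an
# illustrative policy, not a universal rule.
import psycopg2

QUERY = """
    SELECT grantee, table_schema, table_name, privilege_type
    FROM information_schema.role_table_grants
    WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
"""

def flag_public_grants(conn):
    """Print every table privilege granted to PUBLIC."""
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for grantee, schema, table, privilege in cur.fetchall():
            if grantee == "PUBLIC":
                # Remediation would be: REVOKE <privilege> ON <table> FROM PUBLIC
                print(f"Excessive grant: {privilege} ON {schema}.{table} TO PUBLIC")

if __name__ == "__main__":
    # Placeholder credentials for a hypothetical read-only audit account.
    with psycopg2.connect(host="db.example.internal", dbname="appdb",
                          user="auditor", password="auditor-secret") as conn:
        flag_public_grants(conn)
```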

Managing data repositories’ vulnerabilities requires a great deal of skill and ongoing maintenance, especially when you multiply the work by the number of data sources and platforms. It’s a big project, and resolving it all manually is a massive investment in time and resources. Even then, it will not make you sufficiently compliant or more secure on its own.

This leads to the next point: Insider threats increasingly lead to data breaches. Enterprises often have no centralized authority to restrict access, no single source of information on who has access to what and no means of managing entitlements across a large inventory of data sources. All of this must be remediated.

Is There a Solution?

Enterprises are looking for an automated way to analyze sensitive data servers, identify sensitive data, check the risk posture of those servers and generate a vulnerability assessment report to understand the overall risk. With that information in hand, they can follow best practices to remediate every vulnerability the scan uncovers.
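
Conceptually, such an automated assessment is a loop: take an inventory of data sources, run a battery of checks against each one and collect the findings into a single report. The bare-bones sketch below illustrates that flow; the inventory and check functions are hypothetical placeholders for the prebuilt tests a dedicated tool would supply.

```python
# Bare-bones sketch of an automated assessment loop. The inventory and
# check functions are hypothetical stand-ins; a real scanner would
# connect to each data source and run a library of prebuilt tests.
from dataclasses import dataclass

@dataclass
class Finding:
    server: str
    check: str
    detail: str

def check_default_credentials(server):
    return []  # plug in the default-login probe sketched earlier

def check_excessive_grants(server):
    return []  # plug in the PUBLIC-grant scan sketched earlier

def check_missing_patches(server):
    # Placeholder result so the report below has something to show.
    return [Finding(server, "patching", "build is two fix packs behind")]

CHECKS = [check_default_credentials, check_excessive_grants, check_missing_patches]

def assess(inventory):
    """Run every check against every server and collect the findings."""
    findings = []
    for server in inventory:
        for check in CHECKS:
            findings.extend(check(server))
    return findings

if __name__ == "__main__":
    for f in assess(["hr-db.example.internal", "sales-dw.example.internal"]):
        print(f"{f.server}: [{f.check}] {f.detail}")
```

The loop itself is trivial; in practice, the value of an assessment product lies in the breadth and currency of the checks it runs, not in the orchestration.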

This will solve the twofold problem of managing compliance and securing your data.

That may be your ideal scenario, but it can also be a reality. Some organizations are leveraging IBM Security Guardium Vulnerability Assessment to solve this big challenge. To learn more, watch the on-demand webinar “Avoiding the Data Compliance Hot Seat” or watch the latest Guardium Vulnerability Assessment demo:

https://youtu.be/i60ht6UF27s
