With so many efforts focused on restoring systems, applications and workloads, it is easy to miss an important piece: the data that makes business processes possible. A fully restored system is as good as offline if you don’t have the data required to work.
Let’s face it: in the past, technology drove business capabilities. Today, data does. Weirdly, the technology is the easy part. The hard part is trying to figure out what to do with the data, our most valuable asset. We can replace ‘stuff’. Data, once stolen, corrupted or locked, not so much.
Literature and practice in this space are not well defined. If you are looking for a standard or framework on the data life cycle, it may be hard to come by. Instead, let’s use a mixture of a few models out there to guide us through the conversation.
Keep this old saying in mind with data: garbage in, garbage out. Sounds easy, but one would be amazed at how much garbage is out there, creating downstream impacts that are difficult to untangle. Systems of record can be an incredible trouble point if not governed and managed well.
Pro tip: don’t be ‘penny-wise, pound-foolish’ on this first step. Spend the extra effort to get ‘clean’ data into your systems and you will have a more secure and resilient system overall. And tag it well: it makes your life easier, and the technical teams who process and normalize the data will appreciate the lighter workload.
How you collect your data is vital, and it is closely tied to data creation and tagging: how something is created and tagged will shape how it is collected or acquired. The key is to be consistent in your approach while allowing for some shifts over time. Data types and sizes will change, but these three collection methods are safe to build around. Data can be:
- Acquired (something already produced, ready to be ingested)
- Originally produced (think manual entry)
- Captured (think about processes or devices that are creating data points that can be scooped up)
‘Clean’ data is vital to strengthening your cybersecurity posture.
Data processing is another straightforward step which, if done correctly, saves you a great deal of pain in the future. Think of processing and normalization as a cybersecurity basic that enables you to improve cyber hygiene across your enterprise.
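As an illustration, normalization at ingestion can be as simple as standardizing casing, whitespace and timestamps before anything lands in a system of record. A minimal Python sketch; the field names are illustrative, not from any standard:

```python
from datetime import datetime, timezone

def normalize_record(raw: dict) -> dict:
    """Normalize a raw record at ingestion: trim whitespace,
    standardize casing, and store timestamps in UTC ISO 8601.
    Field names here are illustrative only."""
    return {
        "name": raw["name"].strip().title(),
        "email": raw["email"].strip().lower(),
        # Parse a loose timestamp and re-emit it unambiguously in UTC.
        "created": datetime.fromisoformat(raw["created"])
                           .astimezone(timezone.utc)
                           .isoformat(),
    }

record = normalize_record({
    "name": "  ada lovelace ",
    "email": "Ada@Example.COM ",
    "created": "2024-05-01T09:30:00+02:00",
})
print(record["email"])  # ada@example.com
```

Every record that passes through a gate like this comes out in one predictable shape, which is exactly what makes downstream tagging, search and security tooling cheaper to run.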
The easier data is for you to use, the easier it is for others to exploit. Employ cryptographic best practices for data in transit and data at rest throughout the life cycle.
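To make that concrete, here is a minimal Python sketch using only the standard library: a default TLS context for data in transit, and an HMAC tag to detect tampering with data at rest. Note the standard library alone gives integrity, not confidentiality; at-rest encryption should use authenticated encryption (e.g., AES-GCM) from a vetted cryptography library, and the key below is a placeholder that would really come from a key vault.

```python
import hashlib
import hmac
import ssl

# In transit: a default TLS context verifies certificates and hostnames.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED

# At rest: an HMAC tag detects tampering with stored data.
# (Confidentiality needs authenticated encryption from a vetted
# library; this sketch only covers integrity.)
key = b"replace-with-a-secret-from-a-key-vault"  # illustrative only
blob = b"quarterly financials"
tag = hmac.new(key, blob, hashlib.sha256).hexdigest()

def verify(key: bytes, blob: bytes, tag: str) -> bool:
    # compare_digest avoids timing side channels.
    expected = hmac.new(key, blob, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, blob, tag))         # True
print(verify(key, b"tampered", tag))  # False
```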
At this point in the data life cycle, issues can begin to get dicey. Disaster recovery comes into play, meaning data availability and resilience are no longer just about the data; they are about the infrastructure it relies on as well. This means different strategies for different problems to ensure that your recovery point and recovery time objectives (crucial to your business continuity plans) can actually be met.
Issues to consider here include retention, backup locations and types, cyber vaulting, immutable data and time to recover, just to name a few.
And another pro tip: if you are not testing your backups often, you are asking for trouble.
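One lightweight place to start is an automated integrity check: hash the backup and compare it against the source (or a stored manifest), so a silently corrupted or stale copy is caught before you need it. A minimal Python sketch, with the files simulated in a temporary directory:

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large backups don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_matches(source: str, backup: str) -> bool:
    """A restore test in miniature: does the backup match its
    source byte for byte?"""
    return sha256_of(source) == sha256_of(backup)

# Simulate a source file and a backup that has silently drifted.
with tempfile.TemporaryDirectory() as d:
    src, bak = Path(d, "src"), Path(d, "bak")
    src.write_bytes(b"current data")
    bak.write_bytes(b"old data")
    print(backup_matches(str(src), str(bak)))  # False: this backup would fail you
```

A checksum pass is no substitute for periodic full restore drills, but it is cheap enough to run on every backup job.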
Multiple people can now access data from multiple devices at the same time. Without some good governance behind these practices, including change management procedures, you may have a crisis scenario looming. Good data management will be able to trace changes and ensure there are protections and restrictions on who can actually access and modify the data.
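As a sketch of the idea, the toy store below enforces a writer allow-list and appends every change to an audit trail, so you can always answer who changed what, and when. The class and field names are illustrative, not any real product’s API:

```python
from datetime import datetime, timezone

class AuditedStore:
    """Toy data store that enforces per-user write permissions and
    records every change in an append-only audit trail."""

    def __init__(self, writers: set):
        self._writers = writers          # users allowed to modify data
        self._data = {}
        self.audit = []                  # (timestamp, user, key, value)

    def write(self, user: str, key: str, value: str) -> None:
        if user not in self._writers:
            raise PermissionError(f"{user} may not modify {key}")
        stamp = datetime.now(timezone.utc).isoformat()
        self.audit.append((stamp, user, key, value))
        self._data[key] = value

store = AuditedStore(writers={"alice"})
store.write("alice", "forecast", "v2")
try:
    store.write("mallory", "forecast", "v3")
except PermissionError as e:
    print(e)
print(len(store.audit))  # 1: only authorized changes are recorded
```

Real systems push this into the database or identity layer, but the principle is the same: no modification without an identity check, and no modification without a trace.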
This phase is where most people in the business will be involved. In this phase, users perform in-depth analysis and gain insights into the data to support the overall mission and vision. Access control is key here also, because there are so many different ways to get at the data: think data mining, artificial intelligence, machine learning or good old-fashioned human analysis.
Whether it is an alert, engineering designs or financial information, this stage is where almost everybody has their finger in the data pie. At the same time, this stage is open to a lot of risk. You need to address human behavior here. Remember, good cybersecurity and resilience starts with the individual.
So, what’s the difference between storage and archival? Think about it like this: storage is where data goes to be used, backed up and protected. Archival is where data goes before it dies, but can still be pulled back from the grave if you need to. Therefore, as a best practice, do not treat storage and archive as one and the same. Your archive is where you go when all else fails or if you have some type of long-term retention needs.
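A simple way to operationalize that split is an age-based tiering rule: recently used data stays in active storage, stale data moves to the archive. The threshold below is illustrative; in practice it comes from your retention policy, not a default argument.

```python
from datetime import date

def tier_for(last_access: date, today: date,
             archive_after_days: int = 365) -> str:
    """Route data between active storage and the archive by how
    recently it was used. Threshold is illustrative only."""
    age = (today - last_access).days
    return "archive" if age >= archive_after_days else "storage"

today = date(2024, 6, 1)
print(tier_for(date(2024, 5, 1), today))  # storage
print(tier_for(date(2022, 1, 1), today))  # archive
```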
Data you no longer need might still be useful. Think about it like this: one person’s garbage can be another person’s treasure. Therefore, you should properly destroy and dispose of data you no longer use. In the wrong hands, attackers could use that data against you and threaten all security and resilience measures you have put in place.
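For file-level disposal, a common sketch is overwrite-then-delete. One big caveat: on SSDs and copy-on-write filesystems, overwriting in place gives no guarantee, so full-disk encryption with key destruction, or physical destruction of the media, is the safer route for data that truly matters. A hedged Python illustration:

```python
import os
import tempfile

def shred(path: str, passes: int = 1) -> None:
    """Overwrite a file with random bytes before unlinking it.
    Caveat: offers no guarantee on SSDs or copy-on-write
    filesystems; prefer encryption-plus-key-destruction there."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk
    os.remove(path)

with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "secrets.txt")
    with open(p, "w") as f:
        f.write("customer PII")  # simulated sensitive data
    shred(p)
    print(os.path.exists(p))  # False
```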
As our organizational resilience journey starts to reach an end, there are just a couple more emerging issues worth looking at to improve your cybersecurity maturity and lower your risk profile.