With so many efforts focused on restoring systems, applications and workloads, it is easy to miss an important piece: the data that makes business processes possible. A fully restored system is as good as offline if you don’t have the data required to work.

Let’s face it: in the past, technology drove business capabilities. Today, data does. Weirdly, the technology is the easy part. The hard part is trying to figure out what to do with the data, our most valuable asset. We can replace ‘stuff’. Data, once stolen, corrupted or locked, not so much. 

Literature and practice in this space are not well defined. If you are looking for a standard or framework for the data lifecycle, it may be hard to come by. Instead, let’s use a mixture of a few existing models to guide us through the conversation.

Data Creation/Tagging

Keep this old saying in mind with data: garbage in, garbage out. It sounds easy, but you would be amazed at how much garbage is out there, creating downstream impacts that are difficult to untangle. Systems of record can be a major trouble spot if they are not governed and managed well.

Pro tip: don’t be ‘penny-wise, pound-foolish’ on this first step. Spend the extra effort to get ‘clean’ data into your systems and you will have a more secure and resilient system overall. And tag it well. It makes your life easier, and the teams that process and normalize the data will appreciate the lighter workload.
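To make tagging concrete, here is a minimal sketch of attaching governance tags to a record at creation time. The field names (`owner`, `classification`, `source_system`) and the example values are illustrative assumptions, not a standard.

```python
from datetime import datetime, timezone

def tag_record(payload: dict, owner: str, classification: str) -> dict:
    """Wrap a raw record with illustrative governance tags at creation time."""
    return {
        "data": payload,
        "tags": {
            "owner": owner,                    # who is accountable for this data
            "classification": classification,  # e.g. "public", "internal", "restricted"
            "created_at": datetime.now(timezone.utc).isoformat(),
            "source_system": "crm",            # hypothetical system of record
        },
    }

record = tag_record({"customer_id": 42, "name": "Acme Corp"},
                    owner="sales", classification="internal")
print(record["tags"]["classification"])  # internal
```

Tagging at creation, rather than retrofitting tags later, is what makes the downstream collection and processing stages cheaper.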

Data Collection/Acquisition

How you collect your data is vital, and it is closely tied to creation and tagging: how something is created and tagged will affect how it is collected or acquired. The key is to be consistent in your approach while allowing for some shifts over time. Data types and sizes will change, but these three modes of collection are safe to plan around. Data can be:

  • Acquired (something already produced, ready to be ingested)
  • Originally produced (think manual entry)
  • Captured (think about processes or devices that are creating data points that can be scooped up).

‘Clean’ data is vital to strengthening your cybersecurity posture.

Data Processing/Normalization

Data processing is another straightforward step that, done correctly, saves you a great deal of pain in the future. Think of processing and normalization as a cybersecurity basic that enables you to improve cyber hygiene across your enterprise.

The easier you make data to use for yourself, the easier you make it for others to exploit. Employ cryptographic best practices for data in transit and data at rest throughout the life cycle.
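As a simple illustration of normalization, the sketch below canonicalizes two common field types. The formats chosen (lowercased emails, ISO 8601 timestamps assumed to be UTC) are example conventions; what matters is picking one convention and applying it everywhere.

```python
from datetime import datetime, timezone

def normalize_email(raw: str) -> str:
    """Canonicalize an email address: trim whitespace and lowercase."""
    return raw.strip().lower()

def normalize_timestamp(raw: str) -> str:
    """Convert a 'YYYY-MM-DD HH:MM:SS' string (assumed UTC) to ISO 8601."""
    dt = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return dt.isoformat()

print(normalize_email("  Alice@Example.COM "))     # alice@example.com
print(normalize_timestamp("2024-05-01 13:45:00"))  # 2024-05-01T13:45:00+00:00
```

Consistent canonical forms make downstream matching, deduplication and alerting far more reliable.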

Data Storage

At this point in the data life cycle, issues can begin to get dicey. Disaster recovery comes into play, meaning data availability and resilience are no longer just about the data. Rather, they are about the infrastructure it relies on as well. This means different strategies for different problems to ensure that your recovery point and recovery time objectives (crucial to your business continuity plans) can actually be met.

Types of issues you need to consider here are: retention, backup locations and types, cyber vaulting, immutable data and time to recover, just to name a few. 

And another pro tip: if you are not testing your backups often, you are asking for trouble.
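A full restore drill is the real test, but even a lightweight automated check catches silent corruption between drills. This sketch compares SHA-256 digests of a source file and its backup; the file names are made up for the demonstration.

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large backups are not loaded into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source: Path, backup: Path) -> bool:
    """Matching digests catch silent corruption; a restore test should go further."""
    return sha256_of(source) == sha256_of(backup)

# Demonstration with throwaway files standing in for a source volume and its backup.
with tempfile.TemporaryDirectory() as d:
    source = Path(d, "orders.db")
    backup = Path(d, "orders.db.bak")
    source.write_bytes(b"important business data")
    backup.write_bytes(b"important business data")
    match = verify_backup(source, backup)

print(match)  # True
```

Schedule a check like this after every backup job, and treat any mismatch as an incident, not a nuisance.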

Data Management

Multiple people can now access data from multiple devices at the same time. Without some good governance behind these practices, including change management procedures, you may have a crisis scenario looming. Good data management will be able to trace changes and ensure there are protections and restrictions on who can actually access and modify the data. 
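The two ideas in that paragraph, restricting who can modify data and tracing what changed, can be sketched together. The roles, grants and log fields below are illustrative; a real deployment would pull grants from an IAM system and ship the log to tamper-resistant storage.

```python
from datetime import datetime, timezone

# Illustrative role grants; a real deployment would source these from IAM.
PERMISSIONS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
}

audit_log = []

def access(user: str, role: str, action: str, dataset: str) -> bool:
    """Check the role grant and record every attempt, allowed or denied."""
    allowed = action in PERMISSIONS.get(role, set())
    audit_log.append({
        "user": user, "action": action, "dataset": dataset,
        "allowed": allowed, "at": datetime.now(timezone.utc).isoformat(),
    })
    return allowed

print(access("dana", "analyst", "write", "payroll"))  # False: analysts cannot modify
print(access("lee", "engineer", "write", "payroll"))  # True
```

Note that denied attempts are logged too: failed access patterns are often the earliest signal of trouble.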

Data Usage/Analysis

This phase is where most people in the business will be involved. In this phase, users perform in-depth analysis and gain insights from the data to support the overall mission and vision. Access control is key here also, because there are so many different ways to get at the data: think data mining, artificial intelligence, machine learning or good old-fashioned human analysis.

Whether it is an alert, engineering designs or financial information, this stage is where almost everybody has their finger in the data pie. At the same time, this stage is open to a lot of risk. You need to address human behavior here. Remember, good cybersecurity and resilience starts with the individual.

Data Archival

So, what’s the difference between storage and archival? Think about it like this: storage is where data goes to be used, backed up and protected. Archival is where data goes before it dies, but can still be pulled back from the grave if you need it. Therefore, as a best practice, do not treat storage and archive as one and the same. Your archive is where you go when all else fails or when you have some type of long-term retention needs.
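Long-term retention needs lend themselves to a simple policy check. The categories and retention windows below are made-up examples; your legal and compliance teams set the real numbers.

```python
from datetime import date

# Illustrative retention windows in days; real values come from legal/compliance.
RETENTION = {"financial": 7 * 365, "logs": 365}

def disposition(category: str, archived_on: date, today: date) -> str:
    """Decide whether an archived record is retained or eligible for destruction."""
    age = (today - archived_on).days
    limit = RETENTION.get(category)
    if limit is None:
        return "review"  # unknown category: require a human decision
    return "destroy" if age > limit else "retain"

print(disposition("logs", date(2022, 1, 1), date(2024, 1, 1)))       # destroy
print(disposition("financial", date(2022, 1, 1), date(2024, 1, 1)))  # retain
```

Falling back to "review" for unclassified data is deliberate: it keeps the archive from silently accumulating records nobody owns.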

Data Destruction

Data you no longer need might still be useful to someone else. Think about it like this: one person’s garbage can be another person’s treasure. Therefore, you should properly destroy and dispose of data you no longer use. In the wrong hands, that data could be used against you, undermining all of the security and resilience measures you have put in place.
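For file-level destruction, a minimal sketch looks like the following. Be warned that this is a simplification: on SSDs, journaling filesystems and cloud storage, overwriting in place does not guarantee the old blocks are gone, so prefer full-disk encryption with key destruction, or vendor sanitization tools, for anything sensitive.

```python
import os
import tempfile
from pathlib import Path

def overwrite_and_delete(path: Path, passes: int = 1) -> None:
    """Overwrite file contents with random bytes, then unlink it.

    Caveat: SSD wear leveling and filesystem journaling can leave old
    blocks recoverable; this only illustrates the intent of destruction.
    """
    size = path.stat().st_size
    with path.open("r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())
    path.unlink()

# Demonstration with a throwaway file standing in for a stale export.
with tempfile.TemporaryDirectory() as d:
    secret = Path(d, "customer_export.csv")
    secret.write_text("name,id\nexample,1\n")
    overwrite_and_delete(secret)
    still_there = secret.exists()

print(still_there)  # False
```

Whatever mechanism you use, tie destruction to the retention decisions from the archival stage so disposal is a policy outcome, not an ad hoc cleanup.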

As our organizational resilience journey starts to reach an end, there are just a couple more emerging issues worth looking at to improve your cybersecurity maturity and lower your risk profile.
