With so many efforts focused on restoring systems, applications and workloads, it is easy to miss an important piece: the data that makes business processes possible. A fully restored system is as good as offline if you don’t have the data required to work.

Let’s face it: in the past, technology drove business capabilities. Today, data does. Weirdly, the technology is the easy part. The hard part is trying to figure out what to do with the data, our most valuable asset. We can replace ‘stuff’. Data, once stolen, corrupted or locked, not so much. 

Literature and practice in this space are not well defined. If you are looking for a standard or framework for the data lifecycle, it may be hard to come by. Instead, let’s use a mixture of a few models out there to guide us through the conversation.

Data Creation/Tagging

Keep this old saying in mind with data: garbage in, garbage out. Sounds easy, but one would be amazed at how much garbage is out there, creating downstream impacts that are difficult to untangle. Systems of record can be a significant trouble spot if not governed and managed well.

Pro tip: don’t be ‘penny-wise, pound-foolish’ on this first step. Spend the extra effort to get ‘clean’ data into your systems and you will have an overall more secure and resilient system. And tag it well. It makes your life easier, and the technical teams that process and normalize the data will appreciate the lighter workload.
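What does tagging at creation look like in practice? Here is a minimal sketch: a record that carries its tags from the moment it is created, plus a check for missing ones. The tag names (`owner`, `classification`, `retention_class`) are illustrative assumptions, not a standard; your classification scheme will differ.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative required tag set -- substitute your own classification scheme.
REQUIRED_TAGS = {"owner", "classification", "retention_class"}

@dataclass
class Record:
    payload: dict
    tags: dict = field(default_factory=dict)
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def validate_tags(record: Record) -> list[str]:
    """Return the required tags missing from a record, sorted for stable output."""
    return sorted(REQUIRED_TAGS - record.tags.keys())

rec = Record(payload={"invoice_id": 1042},
             tags={"owner": "finance", "classification": "internal"})
print(validate_tags(rec))  # ['retention_class'] -- flagged at creation, not downstream
```

Catching a missing tag here, at creation, is exactly the ‘penny-wise’ investment: it is far cheaper than untangling untagged records downstream.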

Data Collection/Acquisition

Closely related to creation and tagging is how you collect your data: how something is created and tagged will shape how it is collected or acquired. The key is to be consistent in your approach, while allowing for some shifts over time. Data types and sizes will change, but these three categories are pretty safe to live by. Data can be:

  • Acquired (something already produced, ready to be ingested)
  • Originally produced (think manual entry)
  • Captured (think about processes or devices that are creating data points that can be scooped up)

‘Clean’ data is vital to strengthening your cybersecurity posture.

Data Processing/Normalization

Data processing is another straightforward step that, done correctly, saves you a great deal of pain later. Think of processing and normalization as a cybersecurity basic that enables you to improve cyber hygiene across your enterprise.

Remember: the easier you make data to use for yourself, the easier you make it for attackers to exploit. Employ cryptographic best practices for data in transit and data at rest throughout the life cycle.
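To make the normalization idea concrete, here is a minimal sketch that cleans incoming records into a consistent shape: emails lowercased, stray whitespace collapsed, dates stored as ISO 8601. The field names and accepted date formats are illustrative assumptions, not a prescribed schema.

```python
from datetime import datetime

def normalize(record: dict) -> dict:
    """Normalize a raw record into a consistent shape (illustrative rules)."""
    out = dict(record)
    if "email" in out:
        out["email"] = out["email"].strip().lower()
    if "name" in out:
        out["name"] = " ".join(out["name"].split())  # collapse stray whitespace
    if "date" in out:
        # Accept a couple of common input formats; always store ISO 8601.
        for fmt in ("%m/%d/%Y", "%Y-%m-%d"):
            try:
                out["date"] = datetime.strptime(out["date"], fmt).date().isoformat()
                break
            except ValueError:
                continue
    return out

raw = {"email": "  Alice@Example.COM ", "name": "Alice   Smith", "date": "07/04/2023"}
print(normalize(raw))
# {'email': 'alice@example.com', 'name': 'Alice Smith', 'date': '2023-07-04'}
```

Consistent shapes like this are what make every later stage, from storage to analysis, cheaper and less error-prone.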

Data Storage

At this point in the data life cycle, issues can begin to get dicey. Disaster recovery comes into play, meaning data availability and resilience are no longer just about the data itself but also about the infrastructure it relies on. This means different strategies for different problems to ensure that your recovery point and time objectives (crucial to your business continuity plans) can actually be met.

Issues you need to consider here include retention, backup locations and types, cyber vaulting, immutable data and time to recover, just to name a few.

And another pro tip: if you are not testing your backups often, you are asking for trouble.
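The simplest form of the backup-testing tip above is verifying that a backup is byte-for-byte intact. A minimal sketch using checksums (a real test would go further and restore into a sandbox environment, then exercise the restored system):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large backups don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source: Path, backup: Path) -> bool:
    """A backup you haven't verified is a backup you may not have."""
    return sha256(source) == sha256(backup)

# Demo with a throwaway file; filenames are illustrative.
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "ledger.db"
    src.write_bytes(b"critical business data")
    dst = Path(tmp) / "ledger.db.bak"
    shutil.copy2(src, dst)
    print(verify_backup(src, dst))  # True
```

Run something like this on a schedule, not once: a backup that verified last year tells you nothing about the one you will need tomorrow.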

Data Management

Multiple people can now access data from multiple devices at the same time. Without some good governance behind these practices, including change management procedures, you may have a crisis scenario looming. Good data management will be able to trace changes and ensure there are protections and restrictions on who can actually access and modify the data. 
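The two ingredients above, restricting who can modify data and tracing every change, can be sketched in a few lines. The role names and permission map here are illustrative assumptions; a real system would pull them from your IAM or directory service.

```python
from datetime import datetime, timezone

# Illustrative role map -- a real deployment reads this from IAM.
PERMISSIONS = {"analyst": {"read"}, "editor": {"read", "write"}}

audit_log: list[dict] = []

def modify(user: str, role: str, record: dict, key: str, value) -> bool:
    """Apply a change only if the role allows it; log every attempt either way."""
    allowed = "write" in PERMISSIONS.get(role, set())
    audit_log.append({
        "who": user,
        "role": role,
        "action": f"set {key}",
        "allowed": allowed,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    if allowed:
        record[key] = value
    return allowed

record = {"status": "draft"}
print(modify("dana", "analyst", record, "status", "final"))  # False: read-only role
print(modify("sam", "editor", record, "status", "final"))    # True: change applied
print(record["status"])                                      # final
```

Note that denied attempts are logged too; for tracing changes and spotting misuse, the failures matter as much as the successes.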

Data Usage/Analysis

This phase is where most people in the business will be involved. In this phase, users perform in-depth analysis and draw insights from the data to support the overall mission and vision. Access control is key here also, because there are so many different ways to get at the data: think data mining, artificial intelligence, machine learning or good old-fashioned human analysis.

Whether it is an alert, engineering designs or financial information, this stage is where almost everybody has their finger in the data pie. At the same time, this stage is open to a lot of risk. You need to address human behavior here. Remember, good cybersecurity and resilience starts with the individual.

Data Archival

So, what’s the difference between storage and archival? Think about it like this: storage is where data goes to be used, backed up and protected. Archival is where data goes before it dies, but can still be pulled back from the grave if you need it. Therefore, as a best practice, do not treat storage and archive as one and the same. Your archive is where you go when all else fails or if you have some type of long-term retention needs.
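One way to keep storage and archive separate in practice is a retention policy that decides when a record leaves active storage. A minimal sketch, with record types and retention windows that are purely illustrative:

```python
from datetime import date, timedelta

# Illustrative retention rules: days in active storage before archiving.
RETENTION_DAYS = {"invoice": 365, "log": 90}

def should_archive(record_type: str, created: date, today: date) -> bool:
    """Archive once a record outlives its active-storage retention window."""
    limit = RETENTION_DAYS.get(record_type)
    if limit is None:
        return False  # unclassified types stay put until someone classifies them
    return (today - created) > timedelta(days=limit)

today = date(2024, 1, 1)
print(should_archive("log", date(2023, 9, 1), today))      # True: past 90 days
print(should_archive("invoice", date(2023, 9, 1), today))  # False: within a year
```

The policy table is also where legal and regulatory long-term retention requirements get encoded, so archiving stops being an ad hoc decision.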

Data Destruction

Data you no longer need might still be useful to someone else. Think about it like this: one person’s garbage can be another person’s treasure. Therefore, you should properly destroy and dispose of data you no longer use. In the wrong hands, attackers could use that data against you and threaten all the security and resilience measures you have put in place.
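For file-level destruction, ‘delete’ is not enough; deleted files are often recoverable. A minimal sketch of overwriting before deletion follows, with an important caveat: on SSDs and copy-on-write or journaling filesystems, overwriting in place gives no real guarantee, so full-disk encryption with key destruction, or physical media destruction, is the stronger control.

```python
import os
import secrets
import tempfile
from pathlib import Path

def overwrite_and_delete(path: Path, passes: int = 3) -> None:
    """Overwrite file contents with random bytes, then unlink.

    Caveat: SSD wear leveling and copy-on-write filesystems may keep old
    blocks around regardless -- prefer encryption-plus-key-destruction.
    """
    size = path.stat().st_size
    with path.open("r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())
    path.unlink()

# Demo with a throwaway file; the filename is illustrative.
with tempfile.TemporaryDirectory() as tmp:
    p = Path(tmp) / "obsolete_export.csv"
    p.write_text("name,ssn\n...")
    overwrite_and_delete(p)
    print(p.exists())  # False
```

The same principle scales up: disposal of drives, tapes and cloud snapshots needs an explicit, verified destruction step, not just a delete request.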

As our organizational resilience journey starts to reach an end, there are just a couple more emerging issues worth looking at to improve your cybersecurity maturity and lower your risk profile.
