Lessons From the Cold War: Data Security Through Obscurity

During a long-term project in 2016, my team was tasked with reviewing the security features of a popular Soviet-era GPS satellite constellation. The client used it to obtain more precise positioning for workers who were attempting to locate underground utilities. After much digging and poring over obscure technical diagrams, we discovered that this particular class of satellites did not encrypt GPS communications to a civilian end-user device — and it never would.

The constellation was not designed to support cryptography or data security of any kind. However, the communications channels were steady and reliable. The constellation was accessible to nearly every hand-held GPS device on the market. The coordinates were seldom (if ever) off — not even by a meter.

This widely used GPS platform had no encryption at all. So how did the communications stay reliable for over three decades — with no service disruptions, no headline-making hacks and no global “blackouts”?

Data Security: Lessons From the Cold War

The answer appears to be relatively simple (and is the hallmark of a beautiful mind): All GPS communications are split into chunks of data and hidden among thousands of bytes’ worth of garbage. The receiving device is programmed to use the proper phase shift to pick the real data out of the garbage pile and decode the satellite signals.

The design is akin to hiding a needle in a haystack — and it’s brilliant.
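To make the needle-in-a-haystack idea concrete, here is a deliberately simplified Python sketch — not the actual satellite signal design, which relies on spread-spectrum signal processing, but a toy analogue of it. A shared seed plays the role of the “proper phase shift”: the sender scatters payload bytes among random garbage at seed-derived positions, and only a receiver holding the same seed can replay those positions and extract the payload. All function names and parameters here are illustrative.

```python
import random

def hide(payload: bytes, noise_len: int, seed: int) -> bytes:
    """Scatter payload bytes among garbage at positions derived
    from a shared seed (a toy stand-in for the shared phase shift)."""
    rng = random.Random(seed)
    # Fill the buffer with pseudorandom garbage bytes.
    buf = bytearray(rng.randrange(256) for _ in range(noise_len))
    # Choose hiding positions deterministically from the seeded RNG.
    positions = rng.sample(range(noise_len), len(payload))
    for pos, byte in zip(positions, payload):
        buf[pos] = byte
    return bytes(buf)

def recover(buf: bytes, payload_len: int, noise_len: int, seed: int) -> bytes:
    """Replay the same RNG draws to find the hidden positions."""
    rng = random.Random(seed)
    for _ in range(noise_len):
        rng.randrange(256)  # replay the garbage draws to sync RNG state
    positions = rng.sample(range(noise_len), payload_len)
    return bytes(buf[pos] for pos in positions)

msg = b"55.7558N 37.6173E"
blob = hide(msg, noise_len=4096, seed=20160101)
assert recover(blob, len(msg), 4096, 20160101) == msg
```

Anyone sifting the 4 KB blob without the seed sees only uniform noise; with the seed, extraction is trivial. That asymmetry is the whole trick.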

This type of out-of-the-box thinking is typical of countries that had little money to spend on scientific advancement during the Cold War but still needed to maintain a very high level of security. It’s also the kind of thinking the data security field needs more of.

Why? Many people are paranoid about data in today’s world: personally identifiable information (PII), protected health information (PHI), payment card industry (PCI) data, social media data, voter data, political donor data, data about the dog, data about the cat — the list goes on. Now with the General Data Protection Regulation (GDPR), it’s all about the “right to be forgotten” — the right to be scrubbed from the historical pages of social media. (Let’s all ignore the fact that social media is used voluntarily.) Every time another set of regulations is rolled out, security professionals have yet another hoop to jump through.

Meanwhile, criminal cartels and crime syndicates just laugh (and laugh) because they don’t think like everyone else — and they don’t play by the rules.

Creative Security Data Strategies

What if today’s data security practices could be changed by a bit of Cold War-era creative thinking? What if data security could be personalized and tailored to each individual?

Consider the following security data strategies:

  • Dodgy data scrapers: If governments are going to regulate access to PII, PHI, PCI and the like, then they should also regulate the companies that scrape this data and publish it to search engines. It is an entirely worthless practice to demand companies and hospitals protect the personal information of employees when anyone can type a name into Google and get at least 15 entries from information brokerages that are more than happy to cough up the goods on anyone for $14.95.
  • Secure smartphones: Smartphones should be capable of supporting the Vernam cipher, which is the principle behind the one-time pad, for simple text messaging. There isn’t anything wrong with the SMS equivalent of Pretty Good Privacy (PGP), but one-time pads are the most secure of all. One-time pads could also work on offline personal storage.
  • Homegrown solutions: Homegrown cryptography is generally not a good thing to have in a corporate environment. But considering the increasing number of hacks that target encryption mechanisms, this may become a plausible solution for specific enterprises.
  • Dumpster-worthy data: Cloud providers could scramble all of the customer data that is stored in their cloud by merely dumping several terabytes of garbage into it. Then, if an attacker wants to get at the data, they are going to work for it. Where do you get all that dumpster-worthy data? Simple: Scrape social media posts and sentiments for one week — there will be enough useless information to deter anyone.
  • Extreme data access: Self-brokered data auctions are extreme, but they might be the lesser of two evils. The next time a major corporation is hit with ransomware and held hostage, it could simply publish all its customer data to the world. There is nothing to hold hostage if everyone has equal access to the data.
  • Empowered drivers: Give consumers who purchase connected cars (i.e., cars that connect to a personal device via Bluetooth or Wi-Fi) the ability to both secure and scramble the data channel between the car and device.
  • Trash brokers: Hire a garbage data broker. If an individual needs to hide personal information (perhaps they have a high-profile job that requires exceptional cybersecurity), have your garbage data broker flood the online market with so much false information that attackers would have to construct algorithms to pick through it all.
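The Vernam cipher mentioned above is simple enough to sketch in a few lines. The version below is a minimal Python illustration, assuming a pad generated with the standard library’s `secrets` module; the security guarantee holds only if the pad is truly random, at least as long as the message, kept secret and never reused.

```python
import secrets

def vernam(message: bytes, pad: bytes) -> bytes:
    """Vernam cipher: XOR each message byte with the matching pad byte.
    Because XOR is its own inverse, the same function encrypts and decrypts."""
    if len(pad) < len(message):
        raise ValueError("pad must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, pad))

msg = b"meet at the dead drop"
pad = secrets.token_bytes(len(msg))  # one-time pad: random, same length, used once

ciphertext = vernam(msg, pad)
assert vernam(ciphertext, pad) == msg  # XOR twice with the same pad round-trips
```

The catch, and the reason one-time pads never displaced public-key systems, is key distribution: both parties need the same pad ahead of time, delivered over a channel more secure than the one they are protecting.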

Academic institutions, although beneficial for researching new technologies and methods, seldom send researchers outside the polished halls to have discussions with businesses and consumers. It’s only a matter of time before every mathematical method for encrypting and decrypting data is discovered — and every method for constructing and passing synchronous or asynchronous keys is known.

Quantum cryptography, lattice-based cryptography or homomorphic encryption might be the answer. Then again, maybe not. The answer to the future of data security may lie in scientific fields that are not yet known.

Kelly Ryver

Management and Strategy Consultant, IBM

Kelly is a management and strategy consultant with over 20 years of consulting experience ranging from security...