The more data in one place, the more data it attracts.

This “data gravity” is a familiar dynamic for enterprises, even if the term isn’t. As the number of applications hosted on local servers grows, so does the amount of data needed to run them. More data, in turn, demands more applications to manage it. Over time, the cycle repeats and data gravity builds.

Now, this gravity is shifting to the cloud. With companies making the move to cloud storage, analytics and compute services, the volume of data — and its commensurate gravity — is on the rise. But are the very same clouds designed to boost performance at risk of becoming data black holes?

What is data gravity?

Coined in 2010 by Dave McCrory, data gravity is an analog for its physical counterpart.

In the world around us, large objects attract smaller masses. It’s why we don’t fly off the Earth and why the Earth orbits the sun. Moving large objects away from a center of mass is difficult; it’s why sending rockets into space requires hundreds of tons of propellant to break free from our planet’s gravitational pull.

In the digital world, data gravity refers to the tendency of large data “masses” to attract and retain more data. For example, if a company uses a cloud-based ERP system, this system naturally attracts data related to customer histories, transaction details and key business operations. These data types are themselves governed by applications such as CRM solutions or eCommerce portals, which are pulled along toward the same data mass. These applications also come with their own data and, in turn, the applications required to manage that data — and on and on it goes.

The result is a growing data mass whose pull strengthens as it takes in more data. This mass also makes it prohibitively expensive, in both time and resources, to run functions outside the center. Consider a security control located at the edge of a company network. Because data must travel back and forth between the control and the central storage mass, the time required to complete key processes goes up. In addition, data may be compromised in transit to or from the center, in turn lowering the efficacy of these edge tools.
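The latency penalty described above can be sketched with back-of-the-envelope arithmetic. The round-trip time, lookup count and local work below are illustrative assumptions, not measurements:

```python
# Sketch: how round trips to a central data store inflate the time an
# edge security control needs to finish a check. All numbers are
# illustrative assumptions, not measured values.

def total_check_time_ms(round_trips: int, rtt_ms: float, local_work_ms: float) -> float:
    """Total time for an edge control that must consult central storage."""
    return round_trips * rtt_ms + local_work_ms

# A control doing all of its work locally (no trips to the center):
local_only = total_check_time_ms(round_trips=0, rtt_ms=0, local_work_ms=5)

# The same control making 10 lookups against a distant central store
# over a link with a 40 ms round-trip time:
centralized = total_check_time_ms(round_trips=10, rtt_ms=40, local_work_ms=5)

print(local_only)    # 5.0 ms of local work
print(centralized)   # 405.0 ms: network round trips dominate
```

Even with modest round-trip times, the network term quickly dwarfs the local processing, which is what pushes companies to co-locate services with the data mass.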

To address this loss of performance and increase in lag time, many companies are now centralizing key services in the cloud — creating even bigger data masses.

From shared responsibility to shared fate

In the shared responsibility model, keeping cloud services available and secure is the role of the provider. Cloud customers, meanwhile, are responsible for configuring and using the cloud, and for any issues that arise from that configuration and use.

The problem? According to research firm Gartner, cloud customers are the primary driver of cloud security failures. In fact, Gartner predicts that through 2025, 99% of cloud security failures will be the customer’s fault.

To combat this challenge, companies like Google are moving to a “shared fate” model that takes a more active role in cloud configurations, offering guidance, tools and blueprints to help customers succeed. IBM, meanwhile, has developed solutions such as Continuous Cloud Delivery, which help companies create and implement cloud application toolchains that enhance app management and ensure process repeatability.

While the primary impact of this effort is reduced cloud misconfigurations, it also comes with a knock-on effect: increased gravitational pull. If companies know that providers are willing to take on additional responsibilities for data protection and service operation, they’re more likely to accelerate their move to the cloud.

Laws of attraction: Navigating the cloud paradox

If enough physical mass is concentrated in one region, its own gravity collapses it into a black hole. Not only do black holes draw in the matter around them, but once mass crosses the event horizon, there’s no coming back.

This is the cloud paradox. As providers recognize the shift toward all-in cloud models, they’re creating solutions that make it possible for enterprises to shift every aspect of their IT framework into the cloud. Underpinned by evolving solutions such as software-defined networking (SDN), powerful data analytics and artificial intelligence, it’s now possible for cloud services to outpace on-premises options when it comes to everything from security to performance to collaboration.

The challenge? The more data in the same place, the harder it is to leave. While storing services and data across multiple providers adds complexity, it also lowers the overall escape velocity of that data. Put simply, it’s much easier for a company to leave a cloud it uses for only a few services or applications. It’s much harder to make the switch if critical functions and data are housed in a single cloud, and the time and effort required to move grow steeply as more services are added.

Breaking free

When it comes to black hole clouds, the solution lies not in avoidance but in agility.

As noted above, cloud providers are moving to a shared fate model as they recognize the role of data gravity in enterprise operations. To bring businesses on board, they’re both reducing prices and improving performance, making data attractors hard to resist.

To make the most of these solutions without drifting past the point of no return, companies need to create comprehensive and consistent policies around which data and applications belong in large clouds, which are better served by specialized providers and which should stay on-site. For example, a financial institution might shift its client demographic analysis to a large-scale cloud provider to make use of its computational power. The same company might procure cybersecurity services — such as threat intelligence and incident detection and response — from a provider that specializes in these solutions. Finally, it might opt to keep core financial data in-house and under lock and key.
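A placement policy like this can start as an explicit mapping from workload category to sanctioned location, reviewed as needs change. The categories and tier names below are hypothetical, mirroring the financial-institution example above:

```python
# Minimal sketch of a workload-placement policy, assuming three tiers:
# hyperscale cloud, specialized security provider, and on-premises.
# Workload names and placements are hypothetical examples.

PLACEMENT_POLICY = {
    "client-demographic-analytics": "hyperscale-cloud",    # needs raw compute
    "threat-intelligence":          "security-specialist",
    "incident-response":            "security-specialist",
    "core-financial-records":       "on-premises",         # stays in-house
}

def place(workload: str) -> str:
    """Return the sanctioned location for a workload, defaulting to on-prem."""
    return PLACEMENT_POLICY.get(workload, "on-premises")

print(place("threat-intelligence"))   # security-specialist
print(place("unlisted-workload"))     # on-premises (safe default)
```

Defaulting unlisted workloads to on-premises is one deliberate design choice here: nothing drifts into a large cloud until the policy explicitly says it belongs there.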

This approach naturally creates a barrier against data attractors and helps companies resist the pull. It’s also worth digging into provider policies around data lock-in, such as any costs for removing data from cloud frameworks or limits on the amount of data that can be moved at once.
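The lock-in costs worth digging into can be made concrete with rough arithmetic. The per-GB egress fee and link speed below are assumptions for illustration; real provider pricing is tiered and changes over time:

```python
# Sketch of "escape velocity" in dollars and hours: what it takes to
# pull a data mass back out of a cloud. Fee and link speed are assumed
# illustrative values, not any provider's actual pricing.

def egress_cost_usd(data_gb: float, fee_per_gb: float) -> float:
    """Flat-rate egress cost for moving data_gb out of the cloud."""
    return data_gb * fee_per_gb

def transfer_hours(data_gb: float, link_gbps: float) -> float:
    """Time to move data_gb over a fully utilized link (1 GB = 8 Gb)."""
    return (data_gb * 8) / link_gbps / 3600

data_gb = 500_000          # a 500 TB data mass
cost = egress_cost_usd(data_gb, fee_per_gb=0.09)   # assumed $0.09/GB
hours = transfer_hours(data_gb, link_gbps=10)      # assumed 10 Gbps link

print(f"${cost:,.0f}")     # $45,000
print(f"{hours:.1f} h")    # roughly 111 hours of sustained transfer
```

Numbers like these are what turn a migration decision from a weekend project into a budgeted program, which is exactly why checking egress fees and transfer limits up front matters.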

Put simply? Data gravity is growing. To avoid being caught in cloud black holes, enterprises need to identify their IT needs, determine the best-fit location for disparate data sources and deploy specialty providers where appropriate to keep operational data in orbit.
