The more data in one place, the more data it attracts.

This “data gravity” is a familiar phenomenon for enterprises, even if the term isn’t. As the number of applications hosted on local servers increases, so too does the amount of data necessary for them to operate. Add more data, and more applications are required to manage it. Over time, the cycle repeats as data gravity builds.

Now, this gravity is shifting to the cloud. With companies making the move to cloud storage, analytics and compute services, the volume of data — and its commensurate gravity — is on the rise. But are the very same clouds designed to boost performance at risk of becoming data black holes?

What is data gravity?

Coined in 2010 by Dave McCrory, data gravity is a digital analog of its physical counterpart.

In the world around us, large objects attract smaller masses. It’s why we don’t fly off the Earth and why the Earth orbits the sun. Moving large objects away from a center of mass is difficult — this is why sending shuttles into space requires thousands of tons of rocket fuel to break free from our planet’s gravitational pull.

In the digital world, data gravity refers to the tendency of large data “masses” to attract and retain more data. For example, if a company uses a cloud-based ERP system, this system naturally attracts data related to customer histories, transaction details and key business operations. These data types are themselves governed by applications such as CRM solutions or eCommerce portals, which are pulled along toward the same data mass. These applications also come with their own data and, in turn, the applications required to manage that data — and on and on it goes.

The result is a growing data mass whose pull strengthens with every new dataset it brings in. This mass also makes it prohibitively expensive, in both time and resources, to run functions outside the center. Consider a security control located at the edge of a company network. Because data must travel back and forth between the control and the central storage mass, the time required to complete key processes goes up. In addition, data may be compromised on its journey to or from the center, lowering the efficacy of these edge tools.
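To see why those round trips matter, here is a rough, back-of-the-envelope sketch. Every figure in it (round-trip time, lookups per event, event rate) is an illustrative assumption, not a measured value from any particular deployment.

```python
# Back-of-the-envelope sketch: how round trips between an edge security
# control and a central data store add up. All figures are assumptions.

RTT_TO_CENTER_MS = 40    # assumed round-trip time, edge site <-> central store
LOOKUPS_PER_EVENT = 3    # assumed data lookups the control makes per event
EVENTS_PER_SECOND = 200  # assumed event rate at the edge

added_latency_per_event_ms = RTT_TO_CENTER_MS * LOOKUPS_PER_EVENT
aggregate_wait_per_second = added_latency_per_event_ms * EVENTS_PER_SECOND / 1000

print(f"Extra latency per event: {added_latency_per_event_ms} ms")
print(f"Aggregate added wait across all events, per second of traffic: "
      f"{aggregate_wait_per_second:.0f} s")
```

Even with these modest numbers, each event waits on more than a tenth of a second of pure travel time, and the aggregate delay scales with traffic volume.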

To address this loss of performance and increase in lag time, many companies are now centralizing key services in the cloud — creating even bigger data masses.

From shared responsibility to shared fate

In a shared responsibility model, the provider ensures that cloud services are available and secure. Cloud customers, meanwhile, are responsible for configuring and using the cloud — and for any issues that arise from that configuration and use.

The problem? According to research firm Gartner, cloud customers are the primary driver of cloud security failures. In fact, Gartner estimates that through 2025, 99% of cloud security failures will be the customer’s fault.

To combat this challenge, companies like Google are moving to a “shared fate” model that takes a more active role in cloud configurations with guidance, tools and blueprints to help customers succeed. IBM, meanwhile, has developed solutions such as Continuous Cloud Delivery, which help companies create and implement cloud application toolchains that enhance app management and ensure process repeatability.

While the primary impact of this effort is reduced cloud misconfigurations, it also comes with a knock-on effect: increased gravitational pull. If companies know that providers are willing to take on additional responsibilities for data protection and service operation, they’re more likely to accelerate their move to the cloud.

Laws of attraction: Navigating the cloud paradox

If enough physical mass is concentrated in one place, it collapses under its own gravity into a black hole. Not only do these regions of space consume everything that strays too close, but once mass crosses the event horizon, there’s no coming back.

This is the cloud paradox. As providers recognize the shift toward all-in cloud models, they’re creating solutions that make it possible for enterprises to shift every aspect of their IT framework into the cloud. Underpinned by evolving technologies such as software-defined networking (SDN), powerful data analytics and artificial intelligence, cloud services can now outpace on-premises options in everything from security to performance to collaboration.

The challenge? The more data in the same place, the harder it is to leave. While storing services and data across multiple providers increases complexity, it also lowers the overall escape velocity of that data. Put simply, it’s much easier for companies to leave a cloud they use for only a few services or applications. It’s much more difficult to make the switch if critical functions and data are housed in a single cloud, and the time and effort required to move rise sharply as more services are added.
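A quick sketch makes the escape-velocity point concrete. The volume, bandwidth and per-GB egress fee below are illustrative assumptions rather than any provider’s actual quote, but the shape of the result holds: both transfer time and egress cost grow in step with the data mass.

```python
# Illustrative sketch (assumed figures, not provider pricing): why the
# "escape velocity" of data grows as more of it sits with one provider.

DATA_TB = 500              # assumed data volume hosted with a single provider
EGRESS_RATE_GBPS = 2       # assumed sustained outbound bandwidth (gigabits/s)
EGRESS_FEE_PER_GB = 0.09   # assumed egress price in USD/GB (varies widely)

data_gb = DATA_TB * 1000
transfer_days = (data_gb * 8) / EGRESS_RATE_GBPS / 86_400
egress_cost = data_gb * EGRESS_FEE_PER_GB

print(f"Transfer time at {EGRESS_RATE_GBPS} Gbps: ~{transfer_days:.1f} days")
print(f"Egress fees at ${EGRESS_FEE_PER_GB}/GB: ~${egress_cost:,.0f}")
```

Under these assumptions the move takes roughly three weeks of sustained transfer and a five-figure egress bill, before counting the re-engineering work for the applications that travel with the data.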

Breaking free

When it comes to black hole clouds, the solution lies not in avoidance but in agility.

As noted above, cloud providers are moving to a shared fate model as they recognize the role of data gravity in enterprise operations. To bring businesses on board, they’re both reducing prices and improving performance, making data attractors hard to resist.

To make the most of these solutions without drifting past the point of no return, companies need to create comprehensive and consistent policies around which data and applications belong in large clouds, which are better served by specialized providers and which should stay on-site. For example, a financial institution might shift its client demographic analysis to a large-scale cloud provider to take advantage of its computational power. The same company might procure cybersecurity services — such as threat intelligence and incident detection and response — from a provider that specializes in these solutions. Finally, it might opt to keep core financial data in-house and under lock and key.
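One way to make such a policy concrete is a simple placement lookup that defaults to the most conservative option. The categories and destinations below are hypothetical and simply mirror the financial-institution example above; they are not a prescribed taxonomy.

```python
# Minimal sketch of a data-placement policy. Categories and destinations
# are hypothetical examples modeled on the scenario described above.

PLACEMENT_POLICY = {
    "client_demographic_analytics": "hyperscale cloud",            # needs large-scale compute
    "threat_intel_and_detection":   "specialist security provider",
    "core_financial_records":       "on-premises",                  # kept under direct control
}

def placement_for(workload: str) -> str:
    """Return where a workload's data should live, defaulting to on-premises
    until it has been explicitly classified."""
    return PLACEMENT_POLICY.get(workload, "on-premises (unclassified)")

print(placement_for("client_demographic_analytics"))  # -> hyperscale cloud
print(placement_for("new_unreviewed_dataset"))         # -> on-premises (unclassified)
```

Defaulting unclassified workloads to on-premises keeps new data from drifting into a large cloud before anyone has decided it belongs there.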

This approach naturally creates a barrier against data attractors and helps companies resist the pull. It’s also worth digging into provider policies around data lock-in, such as any costs for removing data from cloud frameworks or limits on the amount of data that can be moved at once.

Put simply? Data gravity is growing. To avoid being caught in cloud black holes, enterprises need to identify their IT needs, determine the best-fit location for disparate data sources and deploy specialty providers where appropriate to keep operational data in orbit.
