The more data in one place, the more data it attracts.

This “data gravity” is a familiar phenomenon for enterprises, even if the term isn’t. As the number of applications hosted on local servers grows, so does the amount of data they need to operate. More data, in turn, demands more applications to manage it. Over time, the cycle repeats and data gravity builds.

Now, this gravity is shifting to the cloud. With companies making the move to cloud storage, analytics and compute services, the volume of data — and its commensurate gravity — is on the rise. But are the very same clouds designed to boost performance at risk of becoming data black holes?

What is data gravity?

Coined in 2010 by Dave McCrory, data gravity is an analog of its physical counterpart.

In the world around us, large objects attract smaller masses. It’s why we don’t fly off the Earth and why the Earth orbits the sun. Moving large objects away from a center of mass is difficult: launching a spacecraft requires thousands of tons of rocket fuel to break free from our planet’s gravitational pull.

In the digital world, data gravity refers to the tendency of large data “masses” to attract and retain more data. For example, if a company uses a cloud-based ERP system, this system naturally attracts data related to customer histories, transaction details and key business operations. These data types are themselves governed by applications such as CRM solutions or eCommerce portals, which are carried along toward the data center. These applications also come with their own data and, in turn, the applications required to manage that data — and on and on it goes.

The result is a growing data mass whose attractive force increases with every byte it brings in. This mass also makes it prohibitively expensive, in both time and resources, to run functions outside the center. Consider a security control located at the edge of a company network. Because data must travel back and forth between the control and the central storage mass, key processes take longer to complete. In addition, data may be compromised in transit to or from the center, lowering the efficacy of these edge tools.
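The latency penalty of these round trips can be sketched with a toy model. All of the numbers below are illustrative assumptions, not measurements from any real deployment:

```python
# Toy model: how round trips to a central data mass inflate edge latency.
# rtt_ms and local_ms values are assumed for illustration only.

def edge_task_latency_ms(round_trips: int, rtt_ms: float, local_ms: float) -> float:
    """Total latency for an edge task that must consult central storage:
    one network round trip per consultation, plus local processing time."""
    return round_trips * rtt_ms + local_ms

# An edge security control consulting nearby storage once...
near = edge_task_latency_ms(round_trips=1, rtt_ms=5.0, local_ms=2.0)

# ...versus consulting a distant central data mass ten times per task.
far = edge_task_latency_ms(round_trips=10, rtt_ms=80.0, local_ms=2.0)
```

The point of the sketch is the shape, not the numbers: latency scales with both the distance to the data mass and how often the edge tool must consult it.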

To address this loss of performance and increase in lag time, many companies are now centralizing key services in the cloud — creating even bigger data masses.

From shared responsibility to shared fate

In a shared responsibility model, the provider ensures cloud services are available and secure. Cloud customers, meanwhile, are responsible for configuring and using the cloud, and for any issues that arise from that configuration and use.

The problem? According to research firm Gartner, cloud customers are the primary driver of cloud security failures. In fact, Gartner predicts that through 2025, 99% of cloud security failures will be the customer’s fault.

To combat this challenge, companies like Google are moving to a “shared fate” model, taking a more active role in cloud configuration by offering guidance, tools and blueprints to help customers succeed. IBM, meanwhile, has developed solutions such as Continuous Cloud Delivery, which help companies create and implement cloud application toolchains that enhance app management and ensure process repeatability.

While the primary impact of this effort is reduced cloud misconfigurations, it also comes with a knock-on effect: increased gravitational pull. If companies know that providers are willing to take on additional responsibilities for data protection and service operation, they’re more likely to accelerate their move to the cloud.

Laws of attraction: Navigating the cloud paradox

If enough physical mass is concentrated in one area, its own gravity collapses it into a black hole. Not only do these holes in space consume everything around them, but once mass passes the event horizon, there’s no coming back.

This is the cloud paradox. As providers recognize the shift toward all-in cloud models, they’re creating solutions that make it possible for enterprises to shift every aspect of their IT framework into the cloud. Underpinned by evolving solutions such as software-defined networking (SDN), powerful data analytics and artificial intelligence, it’s now possible for cloud services to outpace on-premises options when it comes to everything from security to performance to collaboration.

The challenge? The more data in the same place, the harder it is to leave. While spreading services and data across multiple providers increases complexity, it also lowers the overall escape velocity of that data. Put simply, it’s much easier for a company to leave a cloud it uses for only a few services or applications. It’s much more difficult to make the switch if critical functions and data are housed in a single cloud, and the time and effort required to move grow steeply as more services are added.
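This “escape velocity” intuition can be sketched as a rough effort model. The weights below are arbitrary assumptions, and the pairwise term is a stand-in for re-testing integrations between services during a migration:

```python
# Toy "escape velocity" model: migration effort grows faster than linearly
# as interdependent services pile up in one cloud. All weights are assumed.

def migration_effort(services: int, data_tb: float,
                     per_service_days: float = 5.0,
                     per_tb_days: float = 0.5,
                     integration_days: float = 1.0) -> float:
    """Rough effort (person-days) to leave a cloud: per-service rework,
    data transfer, plus re-validating every pairwise service integration."""
    integrations = services * (services - 1) / 2  # grows quadratically
    return (services * per_service_days
            + data_tb * per_tb_days
            + integrations * integration_days)

# A few services are cheap to move; an all-in footprint is not.
small = migration_effort(services=3, data_tb=10)
large = migration_effort(services=30, data_tb=500)
```

Under these assumed weights, the quadratic integration term quickly dominates, which is the paradox in miniature: each added service raises the cost of leaving more than the one before it.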

Breaking free

When it comes to black hole clouds, the solution lies not in avoidance but in agility.

As noted above, cloud providers are moving to a shared fate model as they recognize the role of data gravity in enterprise operations. To bring businesses on board, they’re both reducing prices and improving performance, making data attractors hard to resist.

To make the most of these solutions without drifting past the point of no return, companies need to create comprehensive and consistent policies around which data and applications belong in large clouds, which are better served by specialized providers and which should stay on-site. For example, a financial institution might shift its client demographic analysis to a large-scale cloud provider to make use of its computational power. The same company might procure cybersecurity services, such as threat intelligence and incident detection and response, from a provider that specializes in these solutions. Finally, it might opt to keep core financial data in-house and under lock and key.

This approach naturally creates a barrier against data attractors and helps companies resist the pull. It’s also worth digging into provider policies around data lock-in, such as any costs for removing data from cloud frameworks or limits on the amount of data that can be moved at once.
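As part of that due diligence on lock-in, an egress-fee estimate might look like the minimal sketch below. The tier boundaries and per-GB rates are entirely hypothetical and do not reflect any provider’s actual pricing:

```python
# Illustrative egress-cost check for lock-in due diligence.
# Tier boundaries and rates are hypothetical, not real provider pricing.

TIERS = [            # (tier upper bound in GB, assumed USD per GB)
    (10_000, 0.09),
    (50_000, 0.07),
    (float("inf"), 0.05),
]

def egress_cost_usd(gb: float) -> float:
    """Cost to move `gb` of data out of a cloud under tiered per-GB rates."""
    cost, prev_bound = 0.0, 0.0
    for upper_bound, rate in TIERS:
        if gb <= prev_bound:
            break
        # Charge only the portion of data that falls within this tier.
        cost += (min(gb, upper_bound) - prev_bound) * rate
        prev_bound = upper_bound
    return cost
```

Running a model like this before committing data to a provider makes the future cost of “breaking free” a known quantity rather than a surprise.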

Put simply? Data gravity is growing. To avoid being caught in cloud black holes, enterprises need to identify their IT needs, determine the best-fit location for disparate data sources and deploy specialty providers where appropriate to keep operational data in orbit.
