3 Laws Disrupting Data Management

Economics, physics and laws of the land are bearing down on monolithic information systems, forcing new distributed approaches to managing data.

By Damon Brown, May 3, 2019

Moore’s Law hit an unbelievable stride for decades, doubling the number of transistors on an integrated circuit about every two years. That seminal observation dictated the pace of tech innovation, delivering massive computing capabilities at ever-lower cost. It led to a world powered by data, where a different set of laws is simultaneously constraining and liberating how data systems are built.

“Location of the data is becoming more important than the location of compute systems, so we have to think about putting compute capabilities where the data is created,” said Dheeraj Pandey, co-founder, CEO and Chairman of Nutanix. Increasingly the laws of economics, laws of physics and laws of the land are challenging long-standing paradigms for managing data.

Pandey was inspired by Akhil Gupta, VP of Infrastructure at Dropbox, a pioneer in consumer-grade public cloud service. In 2017, Gupta talked about why CIOs need to control certain types of data differently. These three laws are driving many to a hybrid-cloud model, which allows them to more easily store data in a private or public cloud in different locations.

“There isn’t a one-size-fits-all approach to deploying cloud computing,” said Gupta in a Q&A titled Cloud Service Broker: The New Role for IT published in Next Magazine. “Users with workloads that are subjected to more regulations will inevitably gravitate toward private cloud – where their data is stored in a single-tenant environment on servers in third-party data centers – as a way to limit risk. For users looking to take advantage of the scalability, cost and productivity benefits that cloud offers, public cloud services are probably the best solution.”

More than 2.5 quintillion bytes of data is generated every single day. Increasingly, the laws of economics, physics and national sovereignty are constraining old approaches to managing information with large data centers and giant cloud services, according to Pandey.

He said privacy, affordability and the skyrocketing amount of data being created in the world are forcing a shift from monolithic, centralized data systems to a more dispersed or distributed federation approach to managing data. This could mean companies will have mini data centers and a variety of cloud technologies located in places where personal and business data is generated and used.

The Laws of the Land

Data center and cloud computing location is becoming more important due to national and regional regulations like The European Union’s General Data Protection Regulation (GDPR). Data protection, privacy and other new compliance rules around the world are limiting how data is collected and used and where data can and can’t go. Failure to follow these laws can result in class action lawsuits and enormous fines.

Today, there is not much control over where data resides once it’s in the public cloud, said Rajiv Mirani, Nutanix CTO of Cloud Platforms. That needs to change.

“Where is the cloud service data center?” he said, starting a line of questioning IT managers need to ask to ensure they’re abiding by data protection laws. “Where does the backup for that data center reside? Who can look at the data? There’s just not a lot of control over what happens once data is in the public cloud. And, until that’s addressed, CIOs feel that they just don’t have enough control to meet their compliance objectives.”

Laws of the land dictate sovereignty, compliance and security rules, said Pandey. The best way to help companies manage data properly is to move cloud computing into areas where the laws have jurisdiction.

“More things need to be atomized,” he told IT experts gathered at .NEXT in London in late 2018. “More things need to be miniaturized. More things need to be dispersed. The one-size-fits-all approach worked fine for the first generation of cloud computing. Now, the cloud must adapt to the different needs of both consumers and creators.”

Pandey sees today’s centralized monolithic cloud systems breaking into smaller, decentralized and more dynamic sets of platforms. This will lead to better, more affordable and secure cloud services.

He has seen this process play out with previous technologies, as computing evolved from desktops to wearables and from mainframes to serverless networks. He is convinced that public clouds dispersed across different parts of the world will allow companies to more easily abide by local laws and grow.

The Laws of Economics

The laws of economics, coupled with the rapid pace of technology innovation, are driving many to weigh the benefits of renting versus owning compute infrastructure. Many turned to public cloud for its easy accessibility and because it’s an operational rather than a capital cost. But the cost of public cloud often has people wondering whether it makes more sense to run workloads on rented or owned infrastructure, especially as owned private cloud infrastructure becomes more pervasive and better able to interoperate with public cloud. This hybrid approach provides the freedom to run workloads in a public or private cloud, wherever performance needs are met and whichever makes economic sense.

Cloud services will need to adapt, evolve and disperse to democratize cloud computing and make it more affordable to more people around the world, according to Pandey. He said today’s cloud is monolithic and only serves the rented model.

“Today, it has to be a large data center that requires a billion dollars in investment,” he said. “The customer must come to this gigantic, Death Star-like source, and there is no other way to consume it.”

Instead, Pandey sees a future where computing resources are everywhere like “white noise.” Like a constant, accessible hum, technology will be invisible and available virtually anywhere.

To get there, the behemoth cloud services of today will be joined by many smaller clouds that cost a few million rather than billions of dollars to build.

“The whole centralized concept of everything going to a few data centers is just going away, as controlling data that’s spread across the world becomes more complex,” Mirani said.

Instead of building big central data centers, Mirani wants to help cloud service providers build their presence in different countries so they can give customers more control over where data can go.

The Laws of Physics

Lastly, these converging clouds tie into the third tenet of modern computing: the laws of physics. If the past is any indication, demand for data will exceed the capacity of available pipelines. This data gravity is a constant struggle. The demand for more data is unlikely to ever level off, according to Pandey.

“It’s important to understand that you will always be network-challenged,” he said. “There is always a point where the data will grow at a faster rate than the pipe.” He said the best way to solve that problem is to move the compute to the data rather than the data to the compute.

In a world increasingly driven by data, the laws of the land, economics and physics make things more difficult and more expensive to manage. But these three laws are also pointing the way to a future where computing becomes more secure, affordable, pervasive, almost invisible and as easy to consume as a breath of fresh air.

Damon Brown is a contributing writer. He writes a daily column for Inc. Follow him on Twitter @browndamon.

© 2019 Nutanix, Inc. All rights reserved.