In association with Hitachi Vantara
Scott Sinclair wants to debunk two myths associated with cloud computing. The first is that cloud is a zero-sum game in which apps that once ran in the data center are simply relocated to the public cloud, says Sinclair, senior analyst at market research outfit Enterprise Strategy Group (ESG). The second is the idea that eventually all applications will run in the cloud, and data centers will be phased out.
IT strategies for hybrid cloud
“Digital demands are increasing so much that, no matter how fast the cloud is growing, people are still investing in their data centers,” Sinclair says. In ESG’s latest research on data infrastructure trends, respondents reported an average expected growth rate for data in the public cloud of a staggering 39% year over year. But that doesn’t mean the amount of data stored on-premises is declining. In fact, the expected growth rate for data in the data center is comparable: 35% year over year.
“If we think about a large modern enterprise, we may have two, three, four data centers; three, four, five public cloud providers; dozens, if not hundreds of edge locations,” says Sinclair. “And we have data moving and apps moving everywhere all the time.”
For example, the London Stock Exchange Group has dozens of data centers, hundreds of applications, and a presence in Amazon Web Services, Google Cloud, and Microsoft Azure, according to Nikolay Plaunov. He’s a director and technologist in the infrastructure and cloud division of LSEG, the diversified company that runs the stock exchange and also provides data-based financial services. Its portfolio includes virtualized applications running on-premises, containerized apps running in the cloud, and legacy apps running on mainframes.
“What is really hitting people today, versus probably five or 10 years ago, is this idea of, ‘I have these things in my data center, and I have these things I’ve moved to the public cloud and I need to manage a lot more things,’” adds Sinclair. “Now, I’m living in a world where not only do I have to manage a lot more things, but I am constantly dealing with data and apps moving in all directions.”
One of the most significant effects of the 2020 coronavirus pandemic from an information technology (IT) perspective has been the sudden, unplanned migration of applications to the cloud, as organizations moved quickly to accommodate remote workers and the surge of online shoppers. Today, companies find themselves with one foot in the cloud and the other still in the on-premises world, facing significant challenges in terms of how to manage this mixed IT environment, how to secure it, and how to keep costs under control.
A hybrid cloud IT infrastructure, in which resources are distributed across on-premises, private cloud, and public cloud environments, enables companies to accelerate time to market, spur innovation, and increase the efficiency of business processes. And companies are keen on its promises: more than a third (37%) say hybrid is an investment priority over the next year and a half, according to a 2021 ESG survey of 372 IT professionals.
But the complexity of managing a hybrid cloud presents challenges that can bedevil chief information officers, including compatibility with legacy equipment, cybersecurity concerns, and cost issues associated with moving data and managing data access.
To successfully manage a hybrid cloud environment, organizations need a specially designed hybrid cloud management plan that includes the right tools and strategies. These approaches can be as varied as the types of businesses out there, but some guidelines apply across industries—the need for a central control plane, for example, using automation to manage IT operations, and transitioning from managing infrastructure to managing service-level agreements with vendors.
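That shift from managing infrastructure to managing service-level agreements can be made concrete with a simple availability check. The sketch below is illustrative only; the 99.9% monthly target and the downtime figures are assumptions for the example, not terms from any vendor's actual contract.

```python
# Hypothetical SLA check, assuming a 30-day month and a 99.9%
# monthly availability target (a roughly 43.2-minute downtime budget).

def availability(minutes_down: float, minutes_total: float = 30 * 24 * 60) -> float:
    """Fraction of a 30-day month a service was available."""
    return 1 - minutes_down / minutes_total

def meets_sla(minutes_down: float, target: float = 0.999) -> bool:
    """True if measured availability meets or beats the SLA target."""
    return availability(minutes_down) >= target

print(meets_sla(40))   # within the monthly downtime budget
print(meets_sla(60))   # an SLA breach
```

Tracking a handful of such targets per vendor, rather than the underlying servers and storage, is the kind of reframing the SLA-management approach implies.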
It all starts with applications
Russell Skingsley, chief technology officer for digital infrastructure at Hitachi Vantara, says most customers started their cloud journeys with somewhat unrealistic expectations. They initially believed that all apps would eventually end up in the cloud.
What they’re finding is “there are things we can move, there are things we might move, and there are things we definitely can’t move,” Skingsley says.
Sinclair adds that while the rising tide is certainly lifting enterprise apps from the data center to the public cloud, there’s a countercurrent in which organizations are moving some applications from the cloud back to the data center. Some of the reasons cited by organizations speak to the complexity of hybrid cloud management: these include data sensitivity, performance, and availability requirements.
To move applications to the public cloud effectively, an organization needs a systematic methodology, almost a factory-style assembly line, that analyzes each application in its portfolio and then decides which ones to “lift and shift” as-is to the cloud, which to re-factor or rewrite to take full advantage of the cloud, and which to keep on-premises.
The first step is conducting an inventory of the application portfolio. This can help organizations eliminate duplication and identify apps that no longer serve a business purpose and can be de-commissioned. The next step is to analyze applications through the lens of business outcomes. Then, organizations need to make decisions based on factors like time, risk, cost, and value.
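The assembly-line triage described above can be sketched as a simple decision function. Everything here is hypothetical: the `Application` fields and the rules are illustrative stand-ins for the real criteria an organization would weigh (regulatory limits, latency, licensing, refactoring cost), not a prescribed framework.

```python
from dataclasses import dataclass

@dataclass
class Application:
    name: str
    still_used: bool       # does it still serve a business purpose?
    data_sensitive: bool   # regulated, performance-, or availability-bound
    cloud_ready: bool      # e.g. stateless, modern runtime, no mainframe ties

def triage(app: Application) -> str:
    """Return a migration disposition for one application (illustrative rules)."""
    if not app.still_used:
        return "decommission"        # duplicates, or no remaining business purpose
    if app.data_sensitive:
        return "retain on-premises"  # sensitivity, performance, availability
    if app.cloud_ready:
        return "lift and shift"      # move as-is to the public cloud
    return "refactor"                # rewrite to exploit cloud services

# Hypothetical portfolio entries, for illustration only.
portfolio = [
    Application("trade-reporting", still_used=True, data_sensitive=True, cloud_ready=False),
    Application("intranet-wiki", still_used=False, data_sensitive=False, cloud_ready=True),
    Application("order-router", still_used=True, data_sensitive=False, cloud_ready=True),
]

for app in portfolio:
    print(f"{app.name}: {triage(app)}")
```

Running every application through the same function is the point: the decisions become repeatable and auditable rather than ad hoc.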
At London Stock Exchange Group, Plaunov is constantly balancing cost with business criticality. Every application is different and requires its own specific calculation. “I’ve seen several applications that were lifted and shifted to the cloud, and in some cases, it’s relatively simple to optimize them and to optimize their costs.” In other cases, it can be expensive to convert a monolithic app to the public cloud because it entails breaking the app into smaller components.
The company’s risk management team analyzed its application portfolio and identified 14 high-priority apps in one of the business units. “If the application is business-critical and yet is running on obsolete infrastructure, then it’s an obvious choice to do something about it. And if you’re already budgeting for some changes to an application, if there are no regulatory or technological limits, then it’s a candidate to go to the public cloud.”
As more businesses deploy more internet-connected devices and sensors, they find themselves performing initial processing of some data at the edge, then moving relevant data to the cloud or a data center. Organizations need to deploy a data strategy that determines which data should be processed where, and how to most efficiently move data between nodes.
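Such a data strategy amounts to a routing policy: for each piece of data, decide where it is processed or stored. The sketch below is a minimal illustration, with made-up tier names and rules, of what such a policy might look like; it is not any vendor's API.

```python
# Illustrative edge/cloud/data-center routing policy. The fields
# "anomaly" and "aggregate" and the three destinations are assumptions
# made up for this example.

def route_reading(reading: dict) -> str:
    """Decide where one sensor reading should be processed or stored."""
    if reading.get("anomaly"):
        return "edge"         # act locally, with minimal latency
    if reading.get("aggregate"):
        return "cloud"        # summaries feed cloud-based analytics
    return "data-center"      # raw history retained on-premises

readings = [
    {"sensor": "s1", "anomaly": True},
    {"sensor": "s2", "aggregate": True},
    {"sensor": "s3"},
]
plan = {r["sensor"]: route_reading(r) for r in readings}
print(plan)
```

In practice the policy would also weigh the cost of moving data between nodes, which is often the dominant factor.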
Ultimately, a hybrid cloud needs to become a flexible, resilient fabric that accommodates shifting business requirements on the fly, spinning up new application instances as needed, while the underlying storage resources that provide data processing and analytics respond automatically to business needs, says Skingsley.
Download the full report.
This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.