Computing architectures need to be flexible enough to mix and match data, workloads, and their environments along the center-to-edge spectrum.
The edge has become indistinguishable from the center – it’s all a converged computing environment. Now, it’s a matter of deciding how to allocate workloads appropriately across what has become a wide spectrum of computing, enabled by emerging high-speed 5G connections.
That’s the gist of an industry panel discussion earlier this year involving representatives of telecom and infrastructure providers. To see how far we’ve progressed with this convergence, take a look at where workloads are going, says David Shacochis, VP of enterprise technology and field CTO for CenturyLink. “We’re starting to think about the workload as a pretty converged term – whether we’re talking about a virtualized network function, whether we’re talking about a more containerized network function running as a loosely coupled service, or whether we’re talking about a more traditional IT workload and all the different places that applications and business logic can run.”
The challenge, Shacochis says, is “figuring out all the different types of workloads and where they need to be running, and what sorts of types of business outcomes they can enable by putting the right workload on the right computing venue, connected to the right network, and then orchestrated.”
Along with workloads, data is also being generated and stored across every environment. “What’s going to really drive a lot of this is where the data has been created and how you’re going to use that data,” says David Lopez Meco, manager of connectivity innovation and project lead for enterprise networking and edge computing at Telefónica. “There’s going to be an increasing trend that data is created and needs to be used outside of the data center or outside of the public cloud. The data can be created on shop floors in manufacturing and other locations.”
Tremendous amounts of data will be created at the edge, says another panelist, Brian Lappin, head of product management at BT, who points out that the different ways customers use data at the edge will differentiate their capabilities. “If it’s a pop-up store or retail outlet or a manufacturing outlet or a distribution center, your data is going to be key,” he says. Add to this new capabilities such as artificial intelligence and machine learning, “which allow you to analyze that data and use it in real time.”
The enterprise edge “is going to be a key part of the customer’s environment, and the innovation around that, and how customers can use data, is really going to drive how we architect our network and how we architect our services to address those challenges and new requirements that customers have,” says Lappin.
Tomorrow’s computing architectures need to be adaptable enough to mix and match data, workloads, and their environments along the center-to-edge spectrum. “That means you really need an architecture that can adhere to what people really need to do, depending on their business requirements,” says Bob Ghaffari, general manager of enterprise and cloud networking at Intel Corporation. “Enterprises are going to need to think about this whole concept of what you place on your enterprise edge, versus what you do locally in your local cloud, versus what you bring into the public cloud. It’s important to have an architecture that can address this.”
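To make that placement question concrete, the minimal sketch below shows one way an architect might encode a workload-placement policy across the enterprise edge, a local cloud, and the public cloud. The thresholds, venue names, and workload attributes are hypothetical illustrations for discussion, not anything the panelists prescribed.

```python
# Illustrative only: a toy heuristic for deciding where a workload might run
# along the center-to-edge spectrum. All thresholds and categories here are
# assumptions for the sake of example, not drawn from the panel discussion.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    max_latency_ms: float        # tightest response time the workload tolerates
    data_origin: str             # "edge" (e.g., a shop floor) or "central"
    data_volume_gb_per_day: float


def place(w: Workload) -> str:
    """Suggest a venue: enterprise edge, local cloud, or public cloud."""
    # Latency-critical work on locally generated data (e.g., real-time
    # inference on sensor or video feeds) stays next to where the data is made.
    if w.max_latency_ms < 20 and w.data_origin == "edge":
        return "enterprise edge"
    # Large volumes created on-site are costly to backhaul, so keep them local.
    if w.data_origin == "edge" and w.data_volume_gb_per_day > 500:
        return "local cloud"
    # Everything else can take advantage of public-cloud scale and tooling.
    return "public cloud"


if __name__ == "__main__":
    examples = [
        Workload("vision-inspection", 10, "edge", 2000),
        Workload("daily-analytics", 5000, "edge", 800),
        Workload("crm-reporting", 5000, "central", 5),
    ]
    for w in examples:
        print(f"{w.name}: {place(w)}")
```

In practice such a policy would weigh many more factors (cost, data sovereignty, network capacity, orchestration constraints), but even a simple rule set like this shows why the decision has to be made per workload rather than per data center.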