Moving to an edge architecture means managing cost, orchestration, and security challenges.
By now, you’ve likely heard about edge computing — meaning the deployment of data or applications at the network “edge,” close to end-users — and the benefits it offers over conventional cloud computing.
But if you set about trying to figure out how to move to an edge architecture, you may well find that actually taking advantage of edge computing is much harder than talking about why you should.
Indeed, there remain a number of barriers to edge computing adoption for many businesses. Solutions are available, but recognizing the challenges in edge computing is the first step in planning an edge strategy that actually works.
Cost
First and foremost is the cost of deploying applications at the edge.
Hosting applications or data in a traditional cloud data center is relatively cheap. But setting up your own network of local data centers in order to deploy workloads at the edge is likely to be very costly (not to mention logistically complicated).
Public cloud vendors are working to address this challenge by offering services, like AWS Snowball, that are designed to make it easier for customers to move cloud workloads to local sites. But those solutions still cost a fair amount of money and do not necessarily fit within the budgets of smaller organizations.
That may change as edge grows more popular and solutions become cheaper, but for now, there’s no denying that edge computing can be expensive.
Edge computing orchestration
How do you manage workloads that are spread across a distributed edge network of far-flung servers? That remains an open question.
You could try to use an edge management service from a public cloud vendor, but they tend to support only certain types of edge workloads or devices. You could also use a platform like Kubernetes, which excels at managing distributed workloads. But edge orchestration is not Kubernetes’s primary use case, and you’ll need to invest some time and effort in setting it up for the job.
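To give a sense of what that setup effort looks like, here is a minimal sketch using the Kubernetes Python client. It assumes a reachable cluster, a hypothetical "edge-site" node label, and placeholder node, image, and namespace names; a real edge rollout would need its own labeling scheme, registries, and failure handling.

```python
# Minimal sketch: label an edge node and pin a workload to it.
# Requires the Kubernetes Python client (pip install kubernetes) and a kubeconfig.
from kubernetes import client, config

config.load_kube_config()  # reads the local kubeconfig

core = client.CoreV1Api()
apps = client.AppsV1Api()

# Label a (hypothetical) edge node so the scheduler can target it.
core.patch_node("edge-node-chicago", {"metadata": {"labels": {"edge-site": "chicago"}}})

# Deploy a workload pinned to that edge site via nodeSelector.
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="edge-api-chicago"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "edge-api"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "edge-api"}),
            spec=client.V1PodSpec(
                node_selector={"edge-site": "chicago"},
                containers=[client.V1Container(name="api", image="example.com/edge-api:1.0")],
            ),
        ),
    ),
)
apps.create_namespaced_deployment(namespace="default", body=deployment)
```

Even a toy example like this implies a lot of surrounding work: labeling every edge node consistently, deciding how replicas spread across sites, and keeping clusters reachable over unreliable links.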
In short, there remains no easy, fast solution for orchestrating edge workloads.
Edge computing security issues
Keeping workloads at the edge secure can be more challenging than securing those that reside in central data centers. Not only is physical security harder to achieve when you have devices spread out over a large area, but it may also be more difficult to apply security safeguards to devices like IoT hardware that you deploy at the edge. You must also manage security as data moves across your edge network, where it may have more exposure to threats than it would if it stayed within a cloud data center.
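As one illustration of protecting data in transit, the sketch below sends telemetry from an edge device over mutual TLS using only the Python standard library. The host, port, and certificate paths are hypothetical, and a real deployment would also need certificate rotation, retries, and error handling.

```python
# Minimal sketch: an edge device posting telemetry over mutual TLS.
import http.client
import json
import ssl

# Trust only your own certificate authority, and present a device certificate.
context = ssl.create_default_context(cafile="/etc/edge/ca.pem")
context.load_cert_chain(certfile="/etc/edge/device.pem", keyfile="/etc/edge/device.key")

conn = http.client.HTTPSConnection("ingest.example.internal", 8443, context=context)
payload = json.dumps({"sensor_id": "edge-007", "temperature_c": 21.4})
conn.request("POST", "/telemetry", body=payload,
             headers={"Content-Type": "application/json"})
print(conn.getresponse().status)
conn.close()
```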
Here again, these are all challenges that can be overcome, but there is no easy solution. Businesses must plan carefully to meet the unique security requirements of edge workloads.
Supporting widely dispersed users
Edge computing may be easy to implement if all of your users live in the same city or are concentrated in a few distinct regions. You deploy workloads close to those users and call it a day.
But what if your customer base is spread across many countries? The more dispersed your users are, the harder it becomes to set up edge infrastructure that supports all of them equally well.
This may mean that businesses need to compromise when it comes to the edge. They may have to think strategically about which regions would benefit most from edge deployments, based on how many users are located there, and which will have to settle for traditional architectures.
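A toy sketch of that triage might look like the following: rank candidate regions by user count and deploy edge sites only where they serve enough users to justify the cost. The numbers and the threshold are illustrative assumptions, not real figures.

```python
# Toy sketch: decide which regions get edge sites based on user counts.
user_counts = {
    "us-east": 120_000,
    "eu-west": 85_000,
    "ap-south": 40_000,
    "sa-east": 6_000,
}
min_users_per_edge_site = 25_000  # below this, fall back to a central cloud region

edge_regions = [r for r, users in user_counts.items() if users >= min_users_per_edge_site]
cloud_fallback = [r for r in user_counts if r not in edge_regions]

print("Deploy edge sites in:", edge_regions)        # ['us-east', 'eu-west', 'ap-south']
print("Serve from central cloud:", cloud_fallback)  # ['sa-east']
```

Real decisions would weigh more than headcount (latency targets, revenue per region, regulatory constraints), but the basic trade-off is the same.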
Conclusion
Moving to an edge architecture can be harder than all of the heady conversations about edge computing would suggest. You need to manage edge computing costs, orchestration, and security challenges, while also figuring out how to reach your target users efficiently through an edge architecture. These challenges can all be addressed, but none of them has a simple solution.