Pendulum swings are a common occurrence in the IT industry. Take what’s happening with edge infrastructure, which brings assets back from the cloud into network closets. You wouldn’t have edge computing without the cloud; it’s the cloud that makes management at the edge necessary.
To understand how this is so, let’s look back at how cloud infrastructures came about. First, you had centralized data centers where cloud providers such as Amazon, Google and Microsoft ran their services. These data centers were placed not in urban areas, but in more remote places with lower operational costs.
This worked out fine because consumers’ initial expectations for cloud services were somewhat limited. When Microsoft first launched Office 365, for instance, a little latency was tolerated. But as adoption rates grew and customers consumed more and more business-critical applications in the cloud, the tolerance for latency quickly diminished. If enterprises are going to invest in the cloud, they want cloud assets to perform as well as if they were inside the network.
Bringing It Back
So how do the cloud providers handle demands for better performance? They’re now opening regional data centers and co-location facilities in urban areas that are physically closer to clients but still tap cloud infrastructures.
Enterprises are installing on-premises stacks that, depending on an organization’s setup, connect to the co-location facility, the regional data center or both. A version of the client’s apps will run locally to allow ready use without latency, with versions residing in the regional and central data centers for storage and less-immediate needs.
That’s what edge infrastructure is about. It takes advantage of cloud infrastructure but keeps assets at the edge of the network. This prevents bandwidth issues with legacy networks that can’t handle additional workloads. You’re bringing your computing assets – or a version of them – back home. Instead of letting them inside, you keep them at the door.
Solution Provider Opportunities
So why does this matter to solution providers? As organizations pull business-critical applications back from the cloud to the edge of their networks, they will need providers to support, maintain and secure those applications.
This is especially true if you have a managed services practice because it creates a substantial new revenue opportunity. As edge infrastructure assets are deployed, the stacks that hold them will reside in the networking, wiring and server closets that organizations thought they wouldn’t need anymore – or would need less – once they started investing in the cloud. Closets that had two-post racks now will grow to four-post racks.
As the pendulum swings back, companies are realizing that the edge infrastructure approach offers them the best of both worlds – you can still leverage the cloud but get the network performance you need to maintain productivity and efficiency. One industry where this development is already taking hold is retail. To learn how the edge and retail partnership is taking shape, take a look at my blog titled How the Internet of Things and Edge Computing Will Help Revolutionize the Shopping Experience.
Now, even though the application servers will reside at the edge of the network, that doesn’t change the reason enterprises started investing in the cloud in the first place – to save on operational and staff costs and gain new efficiencies. The needs are the same but the approach changes.
To meet those needs and stay on budget, companies will farm out IT maintenance and management to the people who do it best – the solution providers. And that’s why you need to know about edge computing, so you can speak intelligently to customers about their deployment strategy options. To learn even more about edge computing, access this free white paper, The Drivers and Benefits of Edge Computing.
For additional support and resources, join APC by Schneider Electric’s Partner Program.