Operate and Optimize Consistent Edge-Native Infrastructure with Simplicity, Security, and Flexibility
Communications service providers (CSPs) and enterprises are poised to radically expand their edge computing footprint to capitalize on emerging trends and to position themselves for the widespread adoption of 5G and the Internet of Things.
CSPs aren’t the only ones looking to capitalize on opportunities at the edge. Organizations are distributing workloads across multiple clouds and charting their strategy to seize the edge. Indeed, at the dawn of 5G, the rise of highly distributed workforces, widespread digital transformation, ubiquitous cloud services, and low-latency applications are compelling CSPs and enterprises to expand into the edge now or lose market share later. The work-anytime, work-anywhere lifestyle of post-Covid consumers demands compelling, interactive content and immersive computing experiences. With these and other forces driving rapid change, some industry watchers predict that the edge internet economy will surpass $4 trillion by 2030.
Disruption across multiple industries is once again provoking transformation. Retail, manufacturing, and health care, among other sectors, are on the cusp of innovative use cases for edge computing. Customized shopping and immersive retailing highlight the importance of localized data for retailers. AR/VR, trackers, wearable devices, and other innovations are driving remote and autonomous operations in manufacturing and AI-driven operations in medicine. Other industry sectors are shifting gears to support trends like connected vehicles, immersive gaming, collaborative robots, and drone fleets.
Different Challenges and Requirements at the Edge
Expansion to the edge brings new challenges and infrastructure requirements. Service providers are investing in the virtualization of their core network infrastructure and 5G connectivity to enable new, distributed services.
Enterprises, meanwhile, are redesigning their infrastructure and adopting Secure Access Service Edge (SASE) architectures for cloud-delivered networking and security. SD-WAN and SASE represent critical virtual networking technologies for edge applications. The security, resiliency, and session-awareness that these technologies bring to enterprise connectivity are equally applicable and foundational to use cases hosted at the network edge.
But edge computing harbors some differences, and any solution that enables edge computing needs to address these differences. The highly distributed model of edge computing can bump up against common-sense requirements for centralized management. A new type of workload is emerging – edge-native apps – that must run at the edge to perform as intended. How can a centralized model work in a distributed world?
Making a Difference in an Edge-Native World
Distributed infrastructure for running workloads across edge locations and data endpoints makes running and automating applications at the edge different from doing so in the cloud or a data center:
- The number of edge nodes and endpoints can be far greater than in a typical cloud deployment.
- Some edge sites can be in remote locations subject to severe weather, natural disasters, and difficult on-site access.
- Network connections to edge sites can be slow and unreliable.
- The resources available to edge nodes can be constrained by a variety of factors, including space and power.
- Data-intensive low-latency applications can require co-located data or data that is as close as possible.
- Edge sites are less trusted environments than servers in the cloud or in your data center; some are even trust-nothing environments.
- Bad actors might seek to gain physical access to edge sites or boxes on remote towers, and maintaining physical security might be difficult.
To address the differences between the edge and the cloud, edge-native applications will bring some new requirements or, at least, extend some existing cloud-native requirements to their extremes.
Where a workload is placed at the edge is key to meeting requirements, and a distinction between near and far edge can help identify a workload’s requirements.
- The near edge refers to edge-native workloads placed anywhere between the cloud and a remote customer location and delivered as a service.
- The far edge refers to edge-native workloads placed at a remote customer location, in the closest proximity to the endpoints.
Across the edge, several edge-native characteristics will prevail, and differences between the near edge and the far edge are likely to be a matter of degree: Either way, being fast, light, portable, and resource optimized is key.
Edge-native applications will include some of the following imperatives, depending on the edge site:
- Right-size the infrastructure and components of edge sites for their purpose while retaining the flexibility to change the size to accommodate new services.
- Be able to automatically optimize compute and networking capacity for edge applications.
- Centralize and automate operations and maintenance to the greatest extent possible.
- Containerize services for portability; deconstruct them into loosely coupled, independent microservices; and orchestrate them with Kubernetes.
- Use standard APIs that are discoverable yet able to be protected in a highly distributed environment.
- Protect data privacy by segmenting and isolating workloads, and use micro-segmentation to protect microservices.
- Plan for elasticity and failover not only within an edge site, but also across edge sites, in case an entire site goes down.
- Use familiar management tools so that various vendors and contractors can easily address issues, preferably from a centralized location but also with on-site visits when the need arises.
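As a concrete sketch of the containerization and right-sizing imperatives above, an edge-native microservice might be declared as a Kubernetes Deployment with explicit resource requests and limits, so the orchestrator can place it on a constrained edge node. All names, images, and values here are purely illustrative assumptions, not part of any VMware product:

```yaml
# Hypothetical edge-native microservice. The service name, image,
# and resource figures are illustrative only.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-ingest            # hypothetical service name
  labels:
    app: sensor-ingest
spec:
  replicas: 2                    # small footprint for a constrained edge node
  selector:
    matchLabels:
      app: sensor-ingest
  template:
    metadata:
      labels:
        app: sensor-ingest
    spec:
      containers:
      - name: sensor-ingest
        image: registry.example.com/sensor-ingest:1.0   # placeholder image
        resources:
          requests:              # right-sized for limited edge CPU and memory
            cpu: "100m"
            memory: "64Mi"
          limits:
            cpu: "250m"
            memory: "128Mi"
```

Because the workload is a loosely coupled container with declared resource bounds, the same manifest can be applied unchanged at a near edge or far edge site, which is exactly the portability the list above calls for.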
New Edge Reality Brings New Requirements
But these low-level imperatives at the edge must also complement, or at least work in tandem with, a common set of higher-level business requirements that will drive innovation and success across a large, distributed edge.
To meet the demands of customers at the edge amid a rapidly changing world, CSPs and enterprises share these common high-level requirements:
- Agile, modern application development practices that adapt to changing business logic.
- Real-time data processing at the edge to act quickly and securely across a vast, heterogeneous environment.
- Workload placement at locations that best meet their business and security needs while optimizing network traffic, minimizing data movement costs, and streamlining operations.
- Multi-cloud capabilities and unified management with the ability to shift workloads across locations.
- Zero-touch provisioning to support remote deployment to non-traditional endpoints at scale.
- Zero-trust security that is built into all the layers of multi-cloud edge computing and low-latency applications.
- Automation to manage a vast edge network at scale and, as much as possible, from a centralized location.
- Elasticity to dynamically and intelligently scale edge infrastructure and applications to meet changes in demand across locations.
- Portability to easily migrate applications across clouds and edge locations with automation.
- Fast and light: Use containers and Kubernetes to streamline development, deployment, and orchestration of edge applications.
- Minimize the environmental impact of edge sites and their equipment.
- And, finally, perhaps the most important requirement for edge-native infrastructure: simplicity.
For both CSPs and enterprises, business needs will be a primary driver of ways to monetize the edge, such as improving operational efficiency, elevating customer experiences, or fostering new revenue models.
In general, then, to meet these business needs, cloud-native applications require a multi-cloud edge that takes foundational underlay services running on a service provider network, such as private connectivity and multi-access edge computing, and fuses them with overlay services like secure access service edge (SASE).
Common, Consistent Infrastructure across Core and Edge
When it is flexible enough to be right-sized for edge workloads, consistent infrastructure lights up the trail to deploying, automating, and optimizing edge-native services. And when all the edge services, including both the foundational underlay and the enabling overlay, can be orchestrated by a management plane that provides consistent installation, configuration, observability, and management across all edge locations, the result is a solution that embodies simplicity, built-in security, freedom, and flexibility.
Introducing VMware Edge
VMware Edge brings together products from across VMware to enable organizations to run, manage, and secure edge-native apps across multiple clouds at both near edge and far edge locations. VMware Edge solutions include:
- VMware Edge Compute Stack, unveiled at VMworld 2021, is a purpose-built, integrated VM and container-based stack that enables organizations to deploy and secure edge-native apps at the far edge. VMware Edge Compute Stack will be available in Standard, Advanced, and Enterprise editions. VMware also plans to develop a lightweight version of VMware Edge Compute Stack to provide an extremely thin edge for more lightweight apps.
- VMware SASE combines SD-WAN capabilities with cloud-delivered security functions, including cloud web security, zero-trust network access, and firewalling. These capabilities will be delivered as-a-service across both the near and far edge locations from a global network of points of presence (PoPs).
- VMware Telco Cloud Platform has been delivering near edge solutions to the largest communication service providers in the world from their 4G and 5G core all the way to the radio access network (RAN). By helping service providers modernize their network underlay, VMware enables them to deliver overlay edge services to their consumer and enterprise customers.
Building a Broad Edge Ecosystem
VMware has key partnerships across the broad edge ecosystem to deliver seamlessly integrated solutions. Its partner ecosystem spans public cloud providers, service providers, edge-native app developers, network services providers, system integrators, network equipment providers, near-edge hardware manufacturers, and far-edge hardware manufacturers.
For example, VMware Edge Compute Stack is compatible with Dell EMC VxRail, specifically the variation that incorporates the Dell EMC PowerEdge XE2420 server platform. Dell EMC VxRail provides an efficient and agile IT infrastructure that enables automated operations capable of stretching from data centers to cloud and edge environments.
Similarly, VMware Edge Compute Stack will be able to run on top of Lenovo’s ThinkSystem SE350 Edge Servers. The integrated solution, previously announced by Lenovo, is ideal for remote sites that need to process data closer to where it is created and closer to users, including retail stores, manufacturing sites, and schools, to name a few.
Simplifying the Path to Edge Expansion and Monetization
VMware solutions for the edge, both near and far, help you deploy, optimize, and protect edge applications so that you can monetize the edge by supporting emerging use cases. Here are some of the benefits of VMware Edge:
- Consolidate and simplify infrastructure to streamline operations and minimize costs as you expand to the edge
- Provide insights to optimize edge infrastructure and applications
- Ensure flexibility to address new use cases
- Promote business agility
- Eliminate silos and fragmentation
- Support multi-domain workloads and cross-cloud applications
VMware Edge helps you tackle the multi-cloud era and turbocharge innovation at the edge, both near and far, with simplicity, security, and flexibility.
Be sure to join us at VMworld 2021, running virtually from October 5th through the 8th. We have a dedicated track to help you get the most out of the edge, including nearly 100 sessions and over 10 demos. Take a look and let us know if you’d like to discuss your edge.
To learn more about VMware Edge Compute Stack, please review the press release and visit the website.