
Analyst Insight Series: Virtualization virtue #3: Supporting application modernization

Guest post by Jean Atelsek, Senior Research Analyst, S&P Global Market Intelligence 451 Research

This blog is the third in our series on the benefits and trends of virtualization (read blog #1 and blog #2 here) and a companion to the 451 Research Business Impact Brief “The virtues of virtualization.”

Application modernization means different things to different people, but its universal goal is to make information technology more responsive to business needs. Virtualization – the ability to run multiple operating systems on a single physical machine – planted the seed for the cloud computing paradigm that underlies most modern software development, and it also represents a critical bridge between legacy environments and the container-based and AI-driven deployments revolutionizing how we interact with IT.

What does a modern workload look like? Microservices-based architectures, which put software functionality in small, flexible and portable application containers, are one hallmark of cloud-native deployments. 451 Research expects this market to grow at a 15% CAGR to reach $18 billion by 2029. The boom in AI/ML-based workloads is a significant contributor to this forecast.

One big challenge of deploying AI workloads is optimizing usage of the powerful graphics processing units (GPUs) needed to efficiently run them. The latest GPUs from NVIDIA, AMD and Intel have been hard to come by due to high demand, and they are expensive when they do become available, with prices reaching up to $40,000 each. The last thing a company wants after provisioning such pricey silicon is to have it sitting idle in a data center waiting to produce a return on the investment.

The versatility of virtual machines (VMs) lets organizations “sweat the assets” of their IT environment, no matter how heterogeneous the devices that support it. Decoupling applications from the underlying hardware makes it possible to run software seamlessly across different machines and operating systems. For modern workloads with ever-larger datasets, precious GPU capacity can be accessed and shared more effectively because GPU functionality can be sliced up and distributed at scale to support a variety of programs.
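To see how much of that expensive silicon is actually being used, operators can poll the GPUs directly. The following is a minimal sketch, assuming NVIDIA hardware and the nvidia-ml-py (pynvml) Python bindings; the 20% idle threshold is an arbitrary illustration, not a vendor recommendation.

```python
# Minimal sketch: flag underused GPUs with NVIDIA's NVML Python bindings
# (pip install nvidia-ml-py). The 20% cutoff is an illustrative choice.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetName,
    nvmlDeviceGetUtilizationRates, nvmlDeviceGetMemoryInfo,
)

IDLE_THRESHOLD = 20  # percent busy; hypothetical cutoff for "underused"

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        util = nvmlDeviceGetUtilizationRates(handle)  # .gpu = % of time busy
        mem = nvmlDeviceGetMemoryInfo(handle)         # .used/.total in bytes
        status = "candidate for sharing" if util.gpu < IDLE_THRESHOLD else "busy"
        print(f"GPU {i} ({nvmlDeviceGetName(handle)}): {util.gpu}% busy, "
              f"{mem.used / mem.total:.0%} memory in use -> {status}")
finally:
    nvmlShutdown()
```

Feeding metrics like these into a scheduler is what allows sliced GPU capacity (for example, via vGPU profiles or MIG partitions) to be reassigned to whichever VMs need it.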

AI-driven applications are inherently modern, but enterprises are also investing to update their legacy software, which may be piling up technical debt as engineers spend time on the care and feeding of custom-built, mission-critical code. Such software becomes increasingly brittle after decades of patches, maintenance and operating system updates, yet businesses are motivated to undertake the difficult work of modernizing it, especially as GenAI promises to do more of the heavy lifting required. In a recent survey of current or prospective cloud users, respondents cited three top drivers behind their app modernization plans: improving application performance and reliability (33%), reducing IT operations and maintenance costs (30%) and improving the customer experience (29%).

One key approach to modernization is to put programs (or functionally independent parts of them) into containers – lightweight software packages that bundle an application and its dependencies into a full runtime environment. Containers can be orchestrated as fleets to balance load across the network, apply security policies and maintain a desired configuration consistently. Many businesses run containers in or on VMs because of the isolation and security advantages VMs provide (see blogs #1 and #2). A system for operating, updating and monitoring VM and container fleets puts platform engineering and DevOps teams on the same page and reduces friction in the deployment process.
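As a concrete illustration of the fleet idea, here is a minimal sketch using the Docker SDK for Python (the docker package); the image, container name and labels are hypothetical placeholders, and in practice an orchestrator such as Kubernetes would handle this at scale.

```python
# Minimal sketch: launch a containerized service and inspect the fleet
# with the Docker SDK for Python (pip install docker). The image, name
# and labels below are hypothetical placeholders.
import docker

client = docker.from_env()  # talks to the local Docker daemon

# A container bundles the application and its dependencies; labels give
# an orchestrator the handles it needs to manage containers as a fleet.
client.containers.run(
    "nginx:alpine",
    detach=True,
    name="web-frontend-1",
    labels={"tier": "frontend", "env": "staging"},
    ports={"80/tcp": 8080},
)

# Check the running fleet against the desired configuration.
for c in client.containers.list(filters={"label": "tier=frontend"}):
    print(c.name, c.status, c.labels)
```

The same labels the script filters on are what let an orchestrator apply security policies to a group of containers and reconcile the fleet toward a desired state.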

When it comes to hardware upgrades, enterprises don’t have the luxury of starting with a clean sheet of paper when their servers, storage and networking equipment need to be refreshed – “keeping the lights on” is a necessity for mission-critical workloads that may embody the core of a business’s DNA. The ability to replicate and test applications on net-new infrastructure without interrupting customer-facing services gives companies the assurance they need to swap out aging devices for more efficient and performant equipment. The same principle applies when upgrading or adopting new software: Valuable legacy applications may only be compatible with older operating systems, and virtualization makes it possible to keep these running dependably alongside newer software.
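As one illustration of that replicate-and-test workflow in a vSphere environment, here is a minimal sketch using the community pyVmomi bindings to clone a running VM onto a new resource pool while the original keeps serving traffic; the vCenter host, credentials and inventory names are hypothetical placeholders.

```python
# Minimal sketch: clone a production VM onto new hardware for testing,
# using the community pyVmomi bindings (pip install pyvmomi). Host,
# credentials and inventory names are hypothetical placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVim.task import WaitForTask
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only; verify certs in production
si = SmartConnect(host="vcenter.example.com", user="admin",
                  pwd="secret", sslContext=ctx)
try:
    content = si.RetrieveContent()

    def find(vimtype, name):
        """Look up a managed object by name in the vCenter inventory."""
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vimtype], True)
        try:
            return next(o for o in view.view if o.name == name)
        finally:
            view.Destroy()

    vm = find(vim.VirtualMachine, "legacy-app-prod")       # source VM
    pool = find(vim.ResourcePool, "new-hardware-pool")     # net-new cluster

    # The relocate spec points the clone at the new resource pool;
    # the source VM keeps serving traffic while the copy is made.
    relocate = vim.vm.RelocateSpec(pool=pool)
    spec = vim.vm.CloneSpec(location=relocate, powerOn=False)
    WaitForTask(vm.CloneVM_Task(folder=vm.parent,
                                name="legacy-app-test", spec=spec))
finally:
    Disconnect(si)
```

Once the clone validates on the new hardware, traffic can be cut over and the aging equipment retired without a customer-visible outage.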

The benefits of virtualization extend into the application layer and beyond. By helping organizations adapt to ever-changing hardware and software systems to address business needs, virtualization technology paves the way for companies to remain competitive in disruptive times.

About the Author:

Jean Atelsek is a senior research analyst working across the Cloud & Managed Services Transformation channel and digital economics unit of 451 Research, a technology research group within S&P Global Market Intelligence. She covers vendors and technologies that manage or optimize public and private cloud total cost of operations, performance or consumption. This includes FinOps products, platforms and providers that help organizations forecast, analyze and optimize cloud spending based on data collected from the IT environment. In the cloud-native universe, Jean focuses on container-native software and platforms, serverless architectures, service mesh, and the converging worlds of observability, runtime policy enforcement and application networking. She also covers technology accelerators for application modernization, including the use of natural language processing and generative AI to effect code translation.

