If you clicked through on the last post (There is No Spoon), you saw that Paul, our intrepid blogger, finished his rant with:
The Von Neumann Architecture is not so much going to die as it is going
to replicate itself so many times it is going to force us to consider
other, more simple and basic ways to configure and run these things we
Virtualization introduces a layer of abstraction that turns the question around: instead of “let’s see what resources are available and figure out whether we can adapt our problem to use them,” the user says “here is the environment I need to solve my problem; deploy it on the grid as described.” For a user this is a much simpler question. The issue is whether we can implement the middleware that will map such a virtual workspace onto physical resources. One way to implement it would be automated environment installation on a remote node.
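The mapping problem can be sketched in a few lines. This is purely illustrative, assuming nothing about the actual middleware: the names `Workspace`, `Node`, and `find_node` are hypothetical, standing in for a declarative environment description and a scheduler that matches it to physical resources.

```python
# Hypothetical sketch: a "virtual workspace" declares what the user needs;
# the middleware's job is to find physical resources that can satisfy it.
from dataclasses import dataclass

@dataclass
class Workspace:
    image: str       # the environment itself, e.g. a VM image
    cpus: int        # resource quantum requested
    memory_mb: int

@dataclass
class Node:
    name: str
    free_cpus: int
    free_memory_mb: int

def find_node(ws, nodes):
    """Return the first physical node that can host the workspace, else None."""
    for node in nodes:
        if node.free_cpus >= ws.cpus and node.free_memory_mb >= ws.memory_mb:
            return node
    return None

nodes = [Node("gridnode-1", 2, 2048), Node("gridnode-2", 8, 16384)]
ws = Workspace(image="my-app-env.img", cpus=4, memory_mb=4096)
target = find_node(ws, nodes)
print(target.name)  # gridnode-2 is the first node that fits the request
```

The point is that the user only ever writes the `Workspace` half; everything below it is the middleware's problem.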
But what really gives this idea a boost is using virtual machine technology to represent such a workspace. A virtual machine makes the environment easy to describe (you just install it), easy to transport, fast to deploy and, thanks to recent research, very efficient. Best of all, today’s virtual machine management tools let you enforce the resource quantum assigned to a specific virtual machine very accurately: you could, for example, test or demo your application in a virtual cluster that makes sparing use of resources, then redeploy the same virtual cluster on a much more powerful resource for production runs. This is another powerful idea behind virtualization: the environment is no longer permanently tied to a specific amount of resources; instead, the resource quantum can be adjusted on demand.
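The test-then-redeploy pattern above can be made concrete with a small sketch. Again the names (`VirtualCluster`, `Allocation`) are hypothetical, not any real tool’s API; the sketch only shows the key property that the environment (the image) is fixed while its resource quantum varies.

```python
# Hypothetical sketch: the same environment deployed first with a small
# resource quantum for testing, then redeployed with a larger one for
# production. Nothing about the environment itself changes.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Allocation:
    cpus: int
    memory_mb: int

@dataclass(frozen=True)
class VirtualCluster:
    image: str        # the environment: build it once
    nodes: int
    allocation: Allocation

test_cluster = VirtualCluster("my-app-env.img", nodes=4,
                              allocation=Allocation(cpus=1, memory_mb=512))

# Production run: same image, bigger quantum.
prod_cluster = replace(test_cluster,
                       allocation=Allocation(cpus=8, memory_mb=16384))

print(prod_cluster.image == test_cluster.image)  # True: same environment
```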
See also the Grid-Appliance from the University of Florida, which won an Honorable Mention in the Ultimate Virtual Appliance Challenge earlier this year. Expect to see much more about resource pools, utility computing, and virtualization over the next year.