Yet, the pace of information technology often forces IT executives to do that.
In today’s world, mainframe-to-cloud decisions demand careful thinking, or we risk unleashing a technology tornado. This article outlines some key lessons learned on the front line of IT decision-making.
As previously discussed, it’s possible to “modernize” mainframe legacy applications to the cloud. You can get there with little to no modification by using a “lift-and-shift” strategy. Several of my clients have taken this approach to quickly satisfy a “cloud mandate”. The results have been less than desirable:
Without pooled resources, the applications do not scale well.
Timely user provisioning and access from any device remain a challenge because the apps do not provide on-demand, ubiquitous access.
In addition, utility-based pricing/costing is performed manually, with little fidelity to actual usage.
Most importantly, the applications retain monolithic, stove-piped architectures, which are difficult and expensive to maintain and enhance.
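The utility-pricing gap is the easiest of these to illustrate. Below is a minimal sketch of what automated, usage-based costing looks like; the rate and the sample usage figures are hypothetical assumptions, not real cloud pricing:

```python
# Hypothetical sketch: automated utility-based costing from metered usage.
# The rate and sample data are illustrative assumptions, not real pricing.

RATE_PER_CPU_HOUR = 0.12  # assumed flat rate, in dollars


def monthly_charge(cpu_hours_per_day):
    """Sum metered daily usage and apply the utility rate."""
    total_hours = sum(cpu_hours_per_day)
    return round(total_hours * RATE_PER_CPU_HOUR, 2)


# 30 days of metered samples for one application
usage = [20, 24, 18] * 10
print(monthly_charge(usage))  # the charge tracks actual usage, not a manual estimate
```

The point is not the arithmetic itself but that the meter runs automatically: a lifted-and-shifted app with no usage instrumentation forces someone to reconstruct these numbers by hand after the fact.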
These “cloud” applications are more like funnel-cloud apps or tornado apps—waiting to wreak extreme havoc on IT organizations. Assuming you want to avoid funnel clouds and IT tornadoes, consider applying the following five application architecture and design principles indicative of a true cloud application: Continue reading →
In a guest post today, David Klee, a solutions architect from House of Brick Technologies, shares some of the top data disasters in recent IT history, and one way he sees to avoid them:
What good is a security camera in the dark?
It’s not any good at all.
Without light (infra-red or otherwise), a security camera does nothing to help prevent or record theft, and the same goes for “Shadow IT.” When we don’t have data in the light and under surveillance, our ability to watch over it is drastically impaired.
Chief Security Officers and CIOs know that somewhere in their organization, a well-intentioned developer or business person is moving valuable data into the shadows by putting it in the cloud. This scares the “stuff” out of security-minded executives, because 2012 was another wild year of data (in)security around the world. How secure is your data? Do you know who has access to your sensitive data, or where each and every copy of your data resides? Do you have a list of all the places corporate data lives in the cloud? If you don’t know, you are in the shadows. Continue reading →
Why IT Departments are Prioritizing Application Director
If you look at it through an extremely pragmatic, financially conscious lens, it’s not terribly difficult to imagine how things evolve as IT looks to prioritize improvements. Here is an example of how we have seen our customers’ thought processes unfold in discussions: Continue reading →
Memory is faster than disk. People realize that when they need to support high-performance online applications. Recently, many traditional database providers latched onto this and started “washing” their offerings with in-memory variations. At the same time, new companies are jumping into the In-Memory Data Grid (IMDG) space with unproven offerings. However, enterprise data is not something many are willing to experiment with.
VMware virtually pioneered the IMDG, even before it was a category. Its GemFire team has been at this for a while now with a proven, production-grade offering, vFabric GemFire. The latest release, vFabric GemFire 7.0, brings a couple of key enhancements for developers and IT pros alike:
Improving developer productivity
Increasing operational efficiencies
These improvements are in addition to the already proven data consistency and reliability that many have come to expect from vFabric GemFire in their scale-out data architectures. Once more, VMware has shown both the technical know-how and the necessary experience to support enterprise-grade in-memory data at cloud scale. Continue reading →
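To make the IMDG concept concrete, here is a minimal sketch of the core idea: key-value data hash-partitioned across in-memory nodes, with a replica on a second node for reliability. This is not the vFabric GemFire API; every name below is illustrative.

```python
# Illustrative sketch of the core IMDG idea: data partitioned across
# in-memory nodes, with a replica on a second node for reliability.
# This is NOT the vFabric GemFire API; all names here are made up.


class Node:
    def __init__(self, name):
        self.name = name
        self.store = {}  # data lives in RAM, not on disk


class MiniGrid:
    def __init__(self, node_count=3):
        self.nodes = [Node(f"node-{i}") for i in range(node_count)]

    def _owners(self, key):
        primary = hash(key) % len(self.nodes)
        backup = (primary + 1) % len(self.nodes)  # one replica copy
        return self.nodes[primary], self.nodes[backup]

    def put(self, key, value):
        # write to both the primary partition and its replica
        for node in self._owners(key):
            node.store[key] = value

    def get(self, key):
        primary, backup = self._owners(key)
        # fall back to the replica if the primary lost the entry
        return primary.store.get(key, backup.store.get(key))


grid = MiniGrid()
grid.put("order:42", {"total": 99.95})
print(grid.get("order:42"))
```

A production IMDG adds what this sketch leaves out: rebalancing when nodes join or leave, synchronous replication guarantees, and persistence options, which is exactly where the “proven, production-grade” distinction matters.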
In a nutshell, the vCloud Suite helps IT automate application deployments in a way that is very similar to how public clouds like Amazon’s are provisioned today—on demand by end users. The vCloud Suite helps IT organizations transform themselves into service providers, and provides them with the framework to ensure their cloud infrastructure is optimized, cost effective, secure, and flexible enough to meet any business demand.
Register for Session OPS-CIM2646 – Cloud Application Platform Automation on vSphere Infrastructure Leveraging Application Director : Real-World Example of Running a 4 Billion-Dollar Business (VMware IT): Click Here
Register for Session APP-CAP2757 – Accelerate Adoption by Leveraging IaaS for a Complete Deployment and Monitoring Lifecycle: Click Here
Register for Session OPS-CIM2852 – Automated Provisioning for Business Critical Applications (Microsoft/Java) in Private or Public Cloud: Click Here
Follow all VMware AppMgmt updates at VMworld on Twitter: Click Here
Now that number is up to 90%. Here’s an overview of what the business workload lifecycle management implementation looks like under the hood.
As shared in the earlier post, our goal was to automate end-to-end application lifecycle management in a private cloud, and eventually across clouds. Automation by definition speeds things up and makes them less error prone, but in this case, it also meant that VMware’s IT organization could decouple itself from the everyday operations of the app and product teams it serviced. This split between IT and DevOps is a goal for many organizations today that are looking to be more agile, save money, and maintain strong IT governance.
To achieve it, VMware IT automated several key processes across organizations including:
It’s official. IT’s investment in the cloud is accelerating. Gartner recently reported that spending on public cloud services will reach $109 billion this year, up from $91 billion last year. That’s an increase of nearly 20% in one year, and the fastest-growing area of spend according to their predictions.
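A quick check of the year-over-year growth rate behind those Gartner figures:

```python
# Year-over-year growth in public cloud spend, per the Gartner
# figures cited above: $91B -> $109B.
prior, current = 91, 109
growth_pct = (current - prior) / prior * 100
print(f"{growth_pct:.1f}%")  # ~19.8%
```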
How is IT coping with such a dramatic shift in resources? At VMware, we are seeing an organizational shift, which we call the Cloud Operating Model, that capitalizes on this investment. The Cloud Operating Model is both an organizational change and a technology evolution. On the organizational side, IT retrenches and focuses on building out a private cloud that is cost competitive with public clouds, provides end-user services that attract apps to stay in-house, and can support a larger server-to-admin ratio. Application and business teams, presented with readily available infrastructure and armed with sophisticated app management and provisioning tools, transform themselves into DevOps—literally Development-Operations—with full control of application lifecycles, including developing, running, and managing their apps. While IT still provides services to DevOps, the two become untangled from each other’s day-to-day operations.
Did you ever read the book “Who moved my cheese?” It was a 5-year New York Times Bestseller by Dr. Spencer Johnson.
The book speaks of how people react to change and offers several approaches to coping with change. The author, very eloquently, identifies the challenges of reacting and adapting to various changes in our lives.
In the book, Dr. Johnson helps us to understand that as we mature, we come to realize that change is constant: people change, schedules change, jobs change, friends change, and more. We are, for the most part, able to intellectually and emotionally cope and continue moving forward.
I can’t say as much about software. When something changes in its environment or behavior, software usually lacks the capacity to cope and adapt; it remains in a fragile state until there is some human intervention.