
Tag Archives: public cloud

Managing Your Brand: Communications and Marketing for Today’s IT

By Alex Salicrup

Let’s talk about a subject in which nearly every IT department lacks expertise: how to effectively market your capabilities and communicate your value. Readers may think I am exaggerating with my next statement, but IT departments around the world share one trait: their consumers usually hold a less than favorable opinion of them.

Of course, we know that this perception is not true in all cases. However, in my experience, IT does not do a good job at managing consumer perceptions. And in the IT service provider world, managing these perceptions is critical. Unlike yesteryear, IT service providers now have to compete with public cloud providers that manage their brand very well and educate prospects on how their capabilities map to consumer needs.

During my time at VMware, I’ve had the pleasure of working with industry-leading global entities. Many of their IT organizations claim that their consumers are not taking advantage of using external providers, only to find out that they actually are — and in a big way. Others have accepted the fact that competition exists, and that they must address it.

Many IT organizations have concluded that they must manage consumer perception of their capabilities and offerings. In other words, they are trying to figure out how to sell their brand and services internally. Most have no idea how to achieve that. That’s where I come in.

IT communications and marketing is not just about building an IT education campaign. It is a significant change in how IT strategizes and reshapes its internal culture to think and act like a hungry service provider. IT begins to look at each service as a puzzle, with consumer needs as the pieces.

Let me share a few areas to consider as you begin to develop your communications and marketing strategy. I concentrate on eight areas when assembling a marketing and communications plan:

  1. Understand your audience
  2. Interpret consumer perceptions
  3. Define your brand
  4. Identify the catalyst for change
  5. Create your vision
  6. Who, how, and what to communicate
  7. Managing organizational change
  8. Brand perception metrics

Understanding Your Audience
In every organization there are three main levels of strategic and tactical execution, as shown in the figure below.

[Figure: the three levels of strategic and tactical execution]

Execution is different at each of the three levels. Individuals within each level listen to and evaluate solutions based on their domain of responsibility, and they understand solutions only from the point of view of their own level’s needs. Each level therefore has to be addressed with an appropriate message.

Interpreting Customer Perceptions
Marketing campaigns are designed to create perceptions (we’re better than those other guys). Consumer perceptions are always our reality. Understanding consumer perceptions helps us identify how to manage them and how to package a solution.

The problem with negative consumer perceptions about your IT organization or the service you provide is that those perceptions are hard to change. So how do you communicate to your consumers that your people and services are the best solution for their unique needs?

Defining Your Brand
Brand is synonymous with reputation, but also with aspiration. A positive brand, like a reputation, takes time to build and is easily tarnished. Good service providers keep a close watch on how consumers perceive their brand. This awareness allows the provider to shape a consistent message, improve credibility, and enhance its brand by advertising its goals and achievements.

Identifying the Catalyst for Change
Change is not easy. There are two groups within any business that have to experience change, and the group most impacted is IT, which is transitioning from traditional IT delivery to a service provider model. The hardest task, and the one that takes the longest, will be converting the IT personnel. Identifying why change is necessary and answering “what’s in it for you” can motivate your staff to follow your vision.

Creating a Strong Vision
The critical aspect of a successful service communication strategy is the clear articulation of the vision.

Your vision must:

  • Be strategically feasible
  • Be effective
  • Incorporate the current position of the enterprise and catalyst(s) of change
  • Be ambitious
  • Be evidently accomplishable

Managing Organizational Change
No one is really happy about change. Turning your organization from traditional, project-based IT consumption to a service-based consumption model will require both role changes and cultural changes. The former is easier than the latter, and strong leadership is needed to guide both. Furthermore, IT is changing the way the business deals with IT. This is why organizational change management is so important: this is not just an operating change, it is a massive behavioral change that people need to be guided through. Handled crudely, it will severely damage the brand and cast doubt on IT’s capabilities.

Effective communications are key — it’s very important that IT staff understand the unified message. They should become active ambassadors of the IT brand and the services the team provides. Communication, in this sense, refers to the art of persuasion. Crafting a message that is persuasive is a learned skill and essential if a perception is to be changed successfully.

In order to be persuasive, the IT team needs to learn how its consumers think and to predict how consumers will react to events and solutions. People who are good at persuasion develop a keen sense of which solutions work and how messages need to be crafted. This is paramount for any emerging service provider. Communication is about knowing what influences decisions at the three levels illustrated in the figure above. Different messages therefore need to be crafted to persuade each level.

However, one of the highest risks a service provider faces is individuals within IT who do not believe in the solution, the need for it, or how it is being delivered. These skeptics can create doubt among consumers about the solution’s merits and capabilities.

A critical and difficult aspect of change for the IT staff is the understanding, adaptation, and dissemination of the vision and how they choose to communicate it. It is essential that leaders understand the dynamics of their teams, customers, and stakeholders. Understanding how to communicate and use your team to promote your brand and vision is important to your success. (Stay tuned for a future post, where I will talk more about individual motives and capabilities and how they can be mapped to three distinctive groups…)

Measuring Success: Brand Perception Metrics
It is imperative that an IT organization gauge how its consumers feel about the services they consume. The IT team needs to put metrics in place that capture performance against customer needs, and set realistic targets for what is measured.

This does not have to be complex; a simple five-question survey is a great way to start. If the responses are mainly positive, the IT team can share those results with its consumers to reinforce the positive perceptions. If the responses highlight challenges, the team knows exactly where to focus its energy, and that becomes a catalyst for change.
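To make this concrete, here is a minimal sketch of how such a survey could be scored. The question names, the 1-to-5 rating scale, and the 3.5 threshold are illustrative assumptions, not part of any prescribed methodology.

```python
# Hypothetical sketch: scoring a simple five-question consumer satisfaction survey.
# Question names and the 3.5 threshold are illustrative assumptions.

from statistics import mean

QUESTIONS = [
    "Responsiveness to requests",
    "Quality of service delivery",
    "Clarity of service costs",
    "Ease of requesting services",
    "Overall satisfaction with IT",
]

# Each response is a list of five ratings on a 1 (poor) to 5 (excellent) scale.
responses = [
    [4, 5, 3, 4, 4],
    [3, 4, 2, 3, 3],
    [5, 4, 3, 4, 5],
]

THRESHOLD = 3.5  # below this average, treat the area as a catalyst for change

for i, question in enumerate(QUESTIONS):
    avg = mean(r[i] for r in responses)
    status = "reinforce" if avg >= THRESHOLD else "catalyst for change"
    print(f"{question}: {avg:.2f} ({status})")
```

Areas that score well feed the positive message back to consumers; areas that score poorly become the short list of things to fix first.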

In conclusion, I have covered steps and actions in this post that are fairly simple — perhaps perceived as common sense. However, IT traditionally does not have these communication and marketing skillsets. And, the IT organization has not needed them before the advent of public cloud — but they are needed now.

----
Alex Salicrup is a transformation strategist with VMware Accelerate Advisory Services and is based in California.

IT, It’s Time to Change Your Relationship with Change

Author: Reid Engstrom

Many IT organizations are reluctant to drive virtualization to high levels in their compute environment, while casting a speculative eye toward the software-defined data center (SDDC), which expands virtualization to include networking, storage, and management platforms. This reticence is hardly surprising since these advances require changing the data center environment, and change has always introduced the possibility of a technical fault widespread enough to affect business operations.

Though IT continues to build stronger change management systems and test environments to mitigate risk, the IT organization in general remains reluctant to consider major or rapid change.

That may have to change as cloud providers put increasing pressure on IT organizations. Whereas IT formerly had a monopoly on services for lines of business, it now has to compete with public cloud services, such as Amazon Web Services (AWS), that offer fast provisioning, known costs, and a higher level of end-user control. These services are also not controlled or limited by restrictive IT processes.

This competitive landscape is forcing IT to rethink its resistance to change. There is a growing sense of urgency for the IT organization to become more agile, more transparent on service costs, and more collaborative with business stakeholders.

The software-defined data center extends the virtualization concepts you know—abstraction, pooling, and automation—to all data center resources and services. In conjunction with a sophisticated virtualized and automated environment, the SDDC also provides usage cost transparency that will keep your IT organization well ahead of the third-party service providers nipping at its heels.

Watch this short video to learn more about the results that Columbia Sportswear IT has achieved with a VMware software-defined data center.

========

Reid Engstrom is a VMware Accelerate Advisory Services strategist emeritus.

VMware Accelerate™ Advisory Services can help you and your key stakeholders understand the IT-as-a-Service value proposition—our consultants quantify the potential benefits, develop architectural designs, recommend organizational and process changes, create a migration plan and advise during implementation. Visit our Web site to learn more about our offerings, or reach out to us today at: accelerate@vmware.com for more information.

Would you like to continue this conversation with your C-level executive peers? Join our exclusive CxO Corner Facebook page for access to hundreds of verified CxOs sharing ideas around IT Transformation right now by going to CxO Corner and clicking “ask to join group.”

VMware Cloud Compass Tool (powered by Alinean)

Author: Thomas Pisello, Alinean CEO and founder

It can be a challenge to know which cloud solution is best for your particular workloads and business requirements: a private cloud, a public cloud, or a hybrid solution.

To help you determine the best option, VMware worked with the business value experts at Alinean to create the VMware Cloud Compass Tool.

The VMware Cloud Compass Tool factors in your unique workloads, budget goals, risk tolerance, and desired business outcomes to provide a customized, third-party recommendation on the best cloud option for your requirements. The tool weighs the most important elements guiding your cloud decision and takes less than 10 minutes to complete.

Starting with a few simple questions about your company and workload requirements, the tool then provides:

  • A comparison of total cost of ownership (TCO) for various compute options, tallying the differences in CapEx, OpEx, and business benefits between on-premises, private cloud, and public cloud options.
  • An assessment of your Risk Tolerance, analyzing the importance of Availability, Governance and Compliance, Security and Privacy and Business Relationship Management in your selection of the right cloud platform.
  • An assessment of Results Expectations, determining how important Accessibility, Business Responsiveness, Scalability, and Cost & Accounting are to the cloud decision.

Based on the workload, TCO, risk, and results assessments, the tool delivers an online summary of its recommendation, with an overview of the right cloud option for your unique factors and requirements.
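For a sense of the kind of CapEx/OpEx comparison the tool automates, here is a minimal sketch of a multi-year TCO calculation. The dollar figures and cost categories are hypothetical placeholders, not Alinean or VMware benchmark data.

```python
# Hypothetical sketch of a multi-year TCO comparison across compute options.
# All dollar figures are illustrative placeholders, not benchmark data.

options = {
    "on-premises":   {"capex": 500_000, "opex_per_year": 150_000},
    "private cloud": {"capex": 350_000, "opex_per_year": 120_000},
    "public cloud":  {"capex": 0,       "opex_per_year": 220_000},
}

YEARS = 3  # evaluation horizon

for name, costs in options.items():
    tco = costs["capex"] + costs["opex_per_year"] * YEARS
    print(f"{name:>13}: {YEARS}-year TCO = ${tco:,.0f}")
```

The tool itself layers risk tolerance and expected business results on top of this kind of raw cost math, which is where the recommendation comes from.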

For a more detailed view, a complimentary customized white paper can be downloaded and shared with your team, personalized for your specific workloads, budget, risk tolerance, desired business outcomes, and most importantly, cloud recommendations.

For a quick introduction to the VMware Cloud Compass, watch the short video below with VMware Accelerate Advisory Services Benchmark Practice lead Craig Stanley, and you can read Craig’s recent blog, The 3 Rules for Making Confident IT Decisions, for a deep dive on VMware’s risk analysis methodology.

Access the VMware Cloud Compass to determine the best cloud solution for your workload and business requirements.

===

Craig Stanley is the Benchmarking Practice Lead for VMware Accelerate Advisory Services. You can follow him @benchmarkguru and Thomas Pisello @tpisello on Twitter.

If you’re at VMworld San Francisco today, stop by the VMware Accelerate Advisory Services demo booth in the Solution Exchange, and meet Craig in person!


The 3 Rules for Making Confident IT Decisions

Author: Craig Stanley

When presented with a choice between two solutions with an obvious difference in cost and value, you should always choose the cheaper one, right? We all know it is just not that simple. In fact, the more expensive choice may well be the right one when all factors are considered, provided the cost premium delivers value that exceeds both the cost differential and the potential for failure.

The other intangible factors that influence decisions are what can be generalized as “risk.” The major components of risk are: risk exposure, risk tolerance, confidence and trust, probability and chance, and the size of the risk or decision. Counterbalancing risk is return, which is comprised of the same factors, but refers to the ability to achieve value goals. A robust risk analysis establishes a framework for identifying, measuring, evaluating, and objectively comparing these factors.

VMware’s process to analyze risk identifies specific areas of risk, assesses your reaction to the potential for problems to occur and risk tolerance, and computes an inherent risk/return factor that can be applied to the total cost of ownership (TCO). This process is used to create a risk-adjusted TCO by increasing or decreasing the benefit with respect to the perceived risk.

As an IT decision maker, you respond to risk emotionally, and that response influences your decisions and even your ability to make them. When deciding between two options, the decision most likely to deliver a favorable outcome adheres to three general rules:

  1. The ratio of the investment to the expected return influences the decision between financial risk and performance risk.
  2. The level of risk tolerance should exceed the level of risk exposure.
  3. The upside value potential should exceed the value being put at risk.

The first rule evaluates your emotional connection to financial and performance risk. You have to consider how the decision will impact you or your organization if it turns out to be a bad one:

  • What if this doesn’t turn out as expected?
  • What if it ends up costing more and taking longer to implement?
  • Am I getting locked into something I’ll have trouble getting out of?

As the uncertainty around these concerns increases, the likelihood of your decision stalling increases as well, because doing nothing can appear less risky. But making no decision carries its own exposure, in the form of lost opportunities and unmitigated risk. This type of risk can be categorized as performance risk, since it is tied to the likelihood of success or failure of the competing solutions.

The size of the decision’s cost, and the potential revenue or value being put at risk, also make the decision more emotional. This type of risk can be categorized as financial risk, since it is associated with the ratio of the investment to the outcome. The game of poker, for example, is basically the same whether you’re playing a friendly game for pennies or playing with $1,000 chips in Las Vegas. But you play very differently when the stakes of losing are significantly higher, and consequently you are less willing to take chances.

If you were presented with an opportunity to make a sizeable return but the amount you needed to invest was large, you might not accept the opportunity without considerable deliberation. If the same opportunity required only a very small investment, you might accept it immediately.
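As a rough illustration of the first rule, the sketch below uses the ratio of investment to expected return to indicate which type of risk is likely to dominate a decision. The 0.25 threshold and the labels are purely illustrative assumptions, not part of the Accelerate methodology.

```python
# Hypothetical sketch of the first rule: the ratio of investment to expected
# return shifts the decision's emphasis between financial and performance risk.
# The 0.25 threshold and the labels are illustrative assumptions only.

def dominant_risk(investment: float, expected_return: float,
                  threshold: float = 0.25) -> str:
    """Classify which risk type is likely to dominate the decision."""
    ratio = investment / expected_return
    if ratio >= threshold:
        # A large stake relative to the return: financial risk dominates,
        # like playing poker with $1,000 chips instead of pennies.
        return f"ratio {ratio:.2f}: financial risk dominates"
    # A small stake relative to the return: attention shifts to whether the
    # solution will actually perform as promised.
    return f"ratio {ratio:.2f}: performance risk dominates"

print(dominant_risk(investment=10_000, expected_return=20_000))  # large stake
print(dominant_risk(investment=500, expected_return=20_000))     # small stake
```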

The second rule of the decision process is that the level of risk that is acceptable to you should be greater than the level of risk you’re being exposed to. Analyzing these risk factors involves:

  • Identifying the most common incident events that might occur
  • Determining how each event would impact your decision
  • Determining how much risk you can tolerate for each event
  • Evaluating the probability of the event occurrence in each of the decision choices

These risk factors are evaluated to arrive at a risk exposure value and a risk tolerance value for each solution. The gap between the tolerance and the exposure is termed “inherent risk.” If this gap is negative, the inherent risk of your decision is unfavorable: the exposure is greater than what you are willing to accept, and there may be unmitigated risk in the decision. Conversely, if the gap is positive, the inherent risk is favorable and suggests you could assume some additional risk in exchange for additional value. The inherent risk can be applied to the decision investment to create a risk-adjusted investment value.
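For illustration, here is a minimal sketch of the tolerance-versus-exposure gap and one way it might be applied to an investment figure. The event names, the 1-to-10 scoring, the simple averaging, and the proportional adjustment are assumptions made for this sketch; the actual Accelerate scoring model is not shown here.

```python
# Hypothetical sketch of the inherent-risk gap described above.
# Event names, 1-10 scores, simple averaging, and the proportional
# adjustment are illustrative assumptions, not the Accelerate model.

events = {
    # event: (risk tolerance score, risk exposure score)
    "extended outage":    (6, 7),
    "cost overrun":       (5, 4),
    "vendor lock-in":     (4, 6),
    "compliance finding": (7, 4),
}

tolerance = sum(t for t, _ in events.values()) / len(events)
exposure = sum(e for _, e in events.values()) / len(events)

# Inherent risk is the gap between what you can tolerate and what you face.
inherent_risk = tolerance - exposure
print(f"Tolerance {tolerance:.2f}, exposure {exposure:.2f}, "
      f"inherent risk {inherent_risk:+.2f} "
      f"({'favorable' if inherent_risk >= 0 else 'unfavorable'})")

# Apply the gap as a proportional adjustment to the decision investment.
investment = 1_000_000  # hypothetical investment value
risk_adjusted_investment = investment * (1 - inherent_risk / 10)
print(f"Risk-adjusted investment: ${risk_adjusted_investment:,.0f}")
```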

The third rule of the decision process weighs the upside potential against the value being placed at risk. The upside potential is based on the value differential between the solutions. The value being placed at risk, or downside, captures the potential losses that could be incurred within the context of the rated risks. Ideally, the former should be greater than the latter.

For example, let’s assume you can make $1,000 performing some task, but if anything goes wrong, you’re out $100,000. Would you take that risk?  Probably not, since there’s just not enough profit in that scenario to assume a 100:1 risk, unless you have extreme confidence that you have effectively removed the potential for failure.

The results of these three rules can be integrated into an overall decision framework that produces a risk-adjusted investment or TCO for an IT decision, a return on risk, and an estimation of value impact.

The risk adjustment is a function of the inherent risk and the investment. When I’m working with IT decision makers, we compare the inherent risk of the decision being evaluated with the competing TCO values to determine a mitigation-versus-value-opportunity offset. This offset is applied to the TCO to arrive at a risk-adjusted TCO, a TCO that reflects the impact of the inherent risk. The risk-adjusted TCO will increase or decrease depending on the inherent risk factors.

We can determine the return on risk from the ratio of the upside opportunity to the downside exposure, as a function of the inherent risk and the investment ratio. If, as described in the first rule above, the financial risk of your decision is very small, your return on risk may be driven largely by the inherent risk factors. Otherwise, a large financial risk tends to take precedence over the inherent risk. A positive return on risk suggests the potential for success in your decision is good, while a negative one suggests a higher likelihood of failure.

Lastly, we can estimate the overall impact on the value stream by factoring the investment or TCO adjustments within the context of the investment ratio. This result estimates the potential revenue or budgetary impact you may see from your decision relative to competing or comparison solutions.
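Here is a minimal sketch combining a return-on-risk check with a TCO offset, reusing the $1,000 versus $100,000 example from earlier. The simple ratio, the proportional offset formula, and the sample figures are illustrative assumptions, not the actual Accelerate model.

```python
# Hypothetical sketch of a return-on-risk check and a risk-adjusted TCO,
# using the earlier $1,000 upside / $100,000 downside example. The ratio and
# offset formulas are illustrative assumptions, not the Accelerate model.

def return_on_risk(upside: float, downside: float) -> float:
    """Ratio of upside opportunity to downside exposure (>1.0 is favorable)."""
    return upside / downside

def risk_adjusted_tco(tco: float, inherent_risk: float, scale: float = 10.0) -> float:
    """Apply the inherent-risk gap as a proportional offset to TCO."""
    return tco * (1 - inherent_risk / scale)

# Worked example from the text: $1,000 to gain, $100,000 at risk.
ror = return_on_risk(upside=1_000, downside=100_000)
print(f"Return on risk: {ror:.3f} -> {'accept' if ror > 1 else 'decline'}")

# Hypothetical competing solutions with different inherent-risk gaps.
solutions = {"solution A": (900_000, +0.5), "solution B": (800_000, -1.5)}
for name, (tco, gap) in solutions.items():
    print(f"{name}: base TCO ${tco:,.0f}, "
          f"risk-adjusted TCO ${risk_adjusted_tco(tco, gap):,.0f}")
```

In this toy example, the nominally cheaper solution B ends up with the higher risk-adjusted TCO because its inherent risk is unfavorable, which is exactly the kind of reversal the framework is meant to surface.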

Because the risk analysis process reveals both risk and opportunity, these three results enable you to make a more confident decision. Measuring your emotions and beliefs about one solution versus another helps your decision-making process and removes the fog of uncertainty.

The Accelerate risk analysis methodology described here is straightforward to use and delivers results that are relevant, accurate, and easy to understand. The results are provided in a format that can be readily shared throughout the enterprise as needed. Applying risk analysis to the public/hybrid cloud decision process and other major IT initiatives will help you gain insight into the risk factors involved for each alternative, quantify the real value of the risk and opportunity, and increase your confidence in the decision.

Learn more in our white paper How Risk Analysis Streamlines Decision Making for Major IT Initiatives

=====
Craig Stanley is the Benchmarking Practice Lead for VMware Accelerate Advisory Services. You can follow him on Twitter @benchmarkguru.

If you’re at VMworld San Francisco tomorrow, stop by the VMware Accelerate Advisory Services demo booth in the Solution Exchange, and meet Craig in person!


Getting Started on Your Journey: On-Demand Services

Author: Michael Francis

IT organizations are experiencing the change from monopoly holder of IT service delivery to one of many suppliers of IT services to the business. And as one supplier among many, IT needs to engage the business with a competitively differentiated offering.

As CIO, you know that your business customers will migrate to the paths of least resistance to achieve their desired business outcomes, and many times this means engaging public cloud services. The risk to your IT organization and the enterprise is a lack of governance and the potential for increased cost. This approach, while it may initially give the business customer agility, can quickly become detrimental to achieving the broader business initiatives of the enterprise.

I see a second bump on the hype curve of public cloud in my region—Australia. Many of my customers are planning the relocation of assets to the public cloud, with the assumption that it must be more cost-effective. But as the CIO, it’s critical to understand what you’re gaining from relocation and why. Analyzing what drives costs in your private cloud and why these will be reduced in the public cloud is a critical element for success.

Public cloud does play a role in extending enterprise IT capabilities (everything from SaaS-based applications to IaaS platforms), and today’s CIO needs a strong financial and capability-centric business case to act as a service broker. Transforming your IT organization to deliver more value to the business requires a focus on enabling new business capabilities in a more agile way: a focus on business needs rather than on a commodity that may or may not be more cost-effectively delivered by a third-party supplier.

To successfully transition to an IT broker, the transformation of people, process, and technology must begin with the CIO. Without the support of the CIO, any change will likely be incomplete and focus on only one axis of the people, process, and technology composition.

The next step in architecting an IT broker environment is to envision what the environment may look like in the future and why it will look this way. The image below depicts a progressive, mature enterprise IT architecture. IT services are delivered to the presentation layer and consumed through the brokerage layer implemented in the corporate services cloud. This enables the consumer to access a service with an SLA-based agreement between the consumer and IT. The brokerage layer located in the corporate services cloud is responsible for vendor management and SLA reporting.

Once the CIO and enterprise architects share a common vision for the future, the next step is to identify the existing business services offered by the IT organization and their common SLA requirements. This commonality forms the basis to define the initial IT service offerings. Initially this process would only consider infrastructure-level SLA requirements such as availability, recovery, performance, and security. Start with the infrastructure layer, as it tends to have a high degree of commonality and minimal differentiation. It is also the foundation for more advanced IT service offerings like PaaS and SaaS.
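As a purely illustrative example of grouping common infrastructure-level SLA requirements into an initial set of offerings, the sketch below defines hypothetical service tiers and a simple matching rule. The tier names, SLA targets, and matching logic are assumptions for illustration, not a prescribed VMware catalog.

```python
# Hypothetical sketch: grouping common infrastructure SLA requirements into
# an initial service catalog of tiers. Tier names and targets are illustrative.

from dataclasses import dataclass

@dataclass
class ServiceTier:
    name: str
    availability: str   # e.g. "99.9%"
    recovery_time: str  # recovery time objective
    performance: str    # relative performance class
    security_zone: str  # placement / isolation requirement

catalog = [
    ServiceTier("Gold",   "99.99%", "15 min",   "high",     "restricted"),
    ServiceTier("Silver", "99.9%",  "4 hours",  "standard", "internal"),
    ServiceTier("Bronze", "99.5%",  "24 hours", "standard", "internal"),
]

def match_tier(required_availability: float) -> ServiceTier:
    """Return the least-expensive tier that meets a required availability."""
    for tier in reversed(catalog):  # check Bronze first
        if float(tier.availability.rstrip("%")) >= required_availability:
            return tier
    return catalog[0]  # fall back to the highest tier

print(match_tier(99.9).name)  # -> Silver
```

Starting from a small number of tiers like these keeps the initial catalog manageable and leaves room to layer PaaS- and SaaS-level offerings on top later.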

To recap, as CIO you will maximize the probability of success at the onset of your journey to becoming an IT service broker by having a shared vision with your enterprise architects—I can’t overemphasize the importance of a clear and shared understanding of the desired future state. The next step is to implement a brokerage capability—the combination of people, process, and technology operations. Lastly, identify the common SLAs defined for existing business services—these will form the first IT service offerings.

On-demand services can provide the efficiency and agility needed to transform your IT organization from reactive provider to engaged service broker, and finally, to a strategic partner driving the goals of the business.

============

Michael Francis is a principal systems engineer at VMware, based in Brisbane.
