
Monthly Archives: April 2012

5 Key Features of vCloud Integration Manager

By: David Davis

The adoption of VMware vCloud services by service providers (and customers) is growing. In fact, there has been a 3x increase just since Q3 2011, with over 100 VMware service provider partners now offering VMware vCloud Powered services. With more signing up every day, service providers want to get vSphere, vCenter, vCloud Director, vShield, and Chargeback up and running as quickly as possible. The faster they can get these pieces installed and configured, the faster they can start providing cloud infrastructure to customers.


Up until now, getting all of these pieces installed, configured, and integrated could take some time. Once the infrastructure was ready, service providers then had to find ways to automate the delivery of customer and reseller cloud services. Fortunately, VMware has a new solution that eliminates these obstacles and gives service providers a shortcut to profitability and customers a shortcut to cloud computing. That solution is the new VMware vCloud Integration Manager, released February 7, 2012.

Here are 5 things that you need to know about vCloud Integration Manager:

  1. For Service Providers – vCIM isn’t for customers or for most VMware admins; it’s for providers of vCloud Powered services who are part of the VMware Service Provider Program (VSPP).
  2. Important for All Cloud Users – While vCIM isn’t for the typical VMware admin, it’s still important for all VMware admins because it allows vCloud service providers to get up and running faster, be more efficient, and offer cloud services to you (and me), faster than ever.
  3. Built on vSphere and vCloud Director – vCloud Integration Manager doesn’t work with other hypervisors or cloud solutions; it’s just for vSphere and vCloud Director infrastructures. It allows service providers to automate the delivery of infrastructure cloud services and the full lifecycle of vCloud users.
  4. Accelerate and Simplify – vCIM helps to accelerate the provisioning of vSphere, vShield, vCenter, vCloud Director, and Chargeback while offering a web-based administration portal. It offers a REST-based API for integration with the service provider’s back office systems like CRM and billing. On the operations side, vCloud Integration Manager will increase efficiency and reduce costs by streamlining the customer lifecycle and reseller management.
  5. Resources to Learn More About vCloud Integration Manager are Available – you can learn more about vCIM at:
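To make point #4 a little more concrete: this post doesn’t document vCIM’s actual API, but a REST-based provisioning call from a provider’s billing or CRM system would generally look like the sketch below. The endpoint path, payload fields, and token authentication here are hypothetical placeholders for illustration only, not vCIM’s real interface.

```python
import json
from urllib.request import Request

# Hypothetical sketch only: the endpoint, fields, and auth scheme below are
# illustrative placeholders, NOT the actual vCloud Integration Manager API.
def build_provision_request(base_url, api_token, customer_name, vdc_plan):
    """Assemble the HTTP request a billing/CRM system might send to
    provision a new customer organization via a provider's REST API."""
    payload = {
        "customer": customer_name,
        "plan": vdc_plan,                    # e.g. a resource tier or allocation model
        "services": ["vdc", "chargeback"],   # services to enable for the customer
    }
    return Request(
        url=base_url + "/api/customers",     # hypothetical endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_token,
        },
        method="POST",
    )

req = build_provision_request("https://vcim.example.com", "TOKEN", "Acme Co", "gold")
print(req.get_method(), req.full_url)
```

The point of an API like this is that customer sign-up in the back-office system can trigger cloud provisioning automatically, rather than a support ticket and a manual setup by an administrator.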

For more information on vCloud Director, see my posts:

And for future updates, be sure to follow @vCloud and @VMwareSP on Twitter. 

David Davis is a VMware Evangelist and vSphere Video Training Author for TrainSignal. He has achieved CCIE, VCP, VCAP-DCA, and vExpert level status over his 18+ years in the IT industry. David has authored hundreds of articles on the Internet and over 10 video training courses for TrainSignal.com including the popular vSphere video training package. Learn more about David at his blog or on Twitter and check out a sample of his new vSphere 5 video training course over at TrainSignal.com.

The Public Cloud Security Advantage Part III – Stories from Tier 3 Customers


In Parts I and II of this blog series, we’ve explored how companies with strict security needs and compliance requirements have successfully moved to the public cloud with the help of VMware Service Providers, Bluelock and iland. In our final installment in this series, we’ll take a look at two Tier 3 customers and the benefits they’ve received, such as improved availability and cost savings, by making the move to the public cloud.

Financial Transaction Company Utilizes Tier 3 for Secure Stock Trades and Capital Exchanges

The first company we’ll highlight provides secure IT architecture and software for stock trades and capital exchanges for some of the largest financial institutions and investment banks across the Eastern United States.

The company used to operate three datacenters on their own, but according to the company’s CTO, “When we were running our own datacenters, it was a full time job just to evaluate and install all the required security patches…We would have had to hire ten times the number of people we have now to meet the security regulations and expectations of our Fortune 100 clients.”

In addition, a series of natural disasters that caused service interruptions for the company and its clients, as well as rising security requirements and regulations, prompted the company to look at a secure public cloud solution. 

Now the company is delivering high security and availability at a reduced cost with a public cloud SaaS solution at Tier 3. With their cloud deployment they’ve had zero interruptions, despite the blizzard, hurricane, and several tropical storms that have knocked out their power this year alone. 

In sum, the company’s CTO shares, “The cost to continue running our own datacenters with required security compliance and reporting would have been seven times the cost spanning a three year timeframe over what our Tier 3 cloud solution is costing.”

Legal Discovery Company Delivers Secure Services and Development Platform on Tier 3 Infrastructure

The second company we’ll discuss is a new company that produces software to support law firms and legal departments inside of large enterprises. In order to stay competitive, the company focused on delivering superior products with faster time to market and outstanding customer service. With their cloud-based infrastructure through Tier 3, this company has been able to:

  • Increase the speed of development, testing, and resolution of customer issues
  • Create an evaluation environment for new clients that can be spun up or down almost instantly
  • Offer their solution to customers in a SaaS model 

According to a senior operations analyst at the company, “Using Tier 3 provides a huge ROI in so many ways. We have numbers to prove it is three to four times less expensive than building and securing it yourself.”

We hope that this blog series has shed light on the fact that the public cloud can be secure, and many organizations today, even those with strict regulations and compliance requirements, are already taking advantage of the public cloud with the help of VMware and our ecosystem of Service Provider partners. 

Download the Public Cloud Security Advantage Whitepaper for more information on companies highlighted in this blog series. For more stories from the VMware Service Provider and Partner Network, be sure to check out the Global Alliances Blog, and follow @vCloud and @VMwareSP for future updates.

Another VMware Cloud: SEGA Europe Runs Their Hybrid Cloud on VMware

Through Colt and VMware’s Hybrid Cloud Activate service, SEGA Europe was able to reduce the time needed in game-testing implementations by 17%, resulting in more efficient testing and fewer bugs in released games.

SEGA Europe, the European Distribution arm of Tokyo, Japan-based SEGA Corporation and maker of popular games such as Sonic the Hedgehog and Super Monkey Ball, is a digital publisher of interactive entertainment software products. With thousands of users and game testers across Europe, IT agility is essential to the company in order to make the game development process as efficient as possible. 

SEGA Europe is a highly virtualized company, with over 80% of their infrastructure virtualized with VMware. Because of this, it was important to SEGA when choosing a cloud service provider that the public cloud solution was compatible with their existing VMware-based virtualization solution.  It was for this reason that SEGA decided to work with vCloud Datacenter partner, Colt.

According to Francis Hart, Systems Architect at SEGA Europe, SEGA traditionally hosted a lot of its infrastructure for online services within a co-located datacenter, which was very rigid and carried a lot of upfront costs. The company started looking to the cloud in order to improve their IT agility and the speed with which they could provide services to their customers.

 

Through the VMware hybrid cloud model, SEGA Europe was able to achieve the following benefits:

  • Ability to leverage multiple ISPs;
  • Game-testing code became available on-demand to game testers, cutting down the amount of time needed during game-testing implementations;
  • Ability to use a hosted version of the same service and supply it to trusted testing studios around Europe and the world through Colt’s vCloud Director (vCD) platform.

According to Hart, the VMware hybrid cloud model has been a massive success for the company, with all of the UK testing studios using the build delivery system on a daily basis. The company is now looking at using Colt and VMware to move most of their online game environments into the cloud.

For more on SEGA’s hybrid cloud deployment, check out Dana Gardner’s podcast interview with Francis Hart. Visit Another VMware Cloud to learn more about VMware’s Hybrid Cloud Activate service and other companies who have successfully deployed a public or hybrid cloud model through VMware, and be sure to follow us on Twitter at @vCloud and @VMwareSP for more Another VMware Cloud stories! 

The Public Cloud Security Advantage Part II – Stories From iland Customers


Is security in the public cloud possible? In Part I of this blog series we highlighted the experiences of two Bluelock customers and how their move to the public cloud was actually motivated by the need to meet strict regulations and compliance requirements. By taking advantage of Bluelock’s numerous security accreditations and strong physical security, Bluelock’s customers were able to achieve improved security and significant savings in their move to the public cloud. 

In Part II of this blog series, we’ll review the experiences of two iland customers and the benefits they received by moving to the public cloud with iland. iland Internet Solutions provides hosted cloud infrastructure services in North America and Europe that enable customers to leverage enterprise class infrastructure in the form of virtual datacenters with flexible billing and capacity models.

GxPi – Creator of Electronic Document Management Solutions for Life Sciences Meets Regulatory Requirements with iland

GxPi is a ten-year veteran working in Good Practice (GxP) compliance, and provides products and services that simplify compliance with the regulatory requirements within the Life Sciences sector. The company is headquartered in the UK and therefore must comply with specific GxP requirements and regulations. 

Of the company’s decision to utilize a cloud infrastructure as a service (IaaS), the managing director stated, “[Security and operations] is not our core business…I know some cloud providers can do this job better than we can, this is what they focus 100% of their energy on.”  

The managing director then shared that by moving GxPi’s applications to the cloud, they were able to:

  • Increase focus on their core business;
  • Achieve superior IT solutions and cost savings;
  • Take advantage of iland’s reporting and audit capabilities.

According to the managing director, “There are significant savings from operations. Just the capital investments I used to make are equivalent to what I pay for my cloud solution, and that covers everything.” 

eMix – Medical Information Exchange Provider Uses iland to Deliver Secure Medical Information

eMix is a company that provides medical information exchange across several locations and between hospitals, healthcare institutions, physicians and patients. This sensitive information includes medical scans, pictures, x-rays, and other medical documents. Companies operating within the health care industry are under heavy regulatory oversight, so security is a major concern for eMix.

After attempting to operate their own datacenter for six months, it became clear to the company that it was not the best solution. eMix’s general manager (GM) commented, “CIOs have matured from erroneously thinking that just because the data was in their datacenter it was more secure. The fact is that the security effectiveness is based on the implemented policy and strategy with careful tracking and auditing, not where it is or whose equipment it is on.”

The company then decided to adopt a cloud solution in order to: 

  • Increase the reliability, availability and scalability of its business
  • Take advantage of expert cloud staff with round-the-clock support and response 

Because eMix already runs their private cloud on VMware technology, they required a provider that offered a VMware vCloud public cloud solution. iland was selected based on its strong partnership with VMware, solid customer focus and responsiveness.

According to eMix’s GM, the development of enterprise-class cloud services and the security capabilities of those services now make the public cloud not only a viable, but also an appealing alternative.

For more information, download the Public Cloud Security Advantage Whitepaper, and stay tuned for Part III of this series, where we’ll be diving into the experiences of two Tier 3 customers in their move to the public cloud. For more stories from the VMware Service Provider and Partner Network, be sure to check out the Global Alliances Blog, and follow @vCloud and @VMwareSP for future updates.

Big Data in the Cloud – Highlights from #cloudtalk

For this month’s #cloudtalk, we invited the cloud community to discuss big data in the cloud, and share the advantages and challenges of managing big data. In case you missed it, here are some highlights from Tuesday’s chat.

To start things off, we asked participants how they saw the cloud affecting big data in the coming years.

VMware Service Provider @SunGardAS believed that cloud will be the standard delivery method to consume big data, similar to all other IT services for an enterprise, which @ryanprociuk agreed with, adding that the cloud will be the most efficient and effective method to understand how to take advantage of big data. @Tier3, a provider of vCloud Powered services, shared that the cloud allows the flexibility to scale and process big data efficiently, without high upfront costs.

@jayfry3 suggested that for folks trying to understand the IT infrastructure impact of big data, using cloud seems like a smart way to test the waters, which @SunGardAS agreed with, adding that education is key for the adoption of big data – IT needs to understand the power of big data and how they can leverage it.

We then asked participants what they thought the main challenges were to managing big data in the cloud.

@michellecraven responded that security is still a major concern to running big data in the cloud, while others, such as @wsellers, @Tier3, and @valb00, agreed that training, openness to change, and IT inertia are all challenges to managing big data.

@antonvirtual pointed out that the biggest challenge to managing big data is actually the fact that data is BIG and involves the management of hundreds of terabytes. @ryanprociuk thought that the challenge of big data starts with defining and understanding what big data is best managed in the cloud.

Following this, we asked participants what they thought the business advantages were to managing big data in the cloud. 

@antonvirtual stated that one of the biggest advantages is the cloud – you don’t need to invest in infrastructure to store and process big data, which both @Tier3 and @wsellers agreed with. @ryanprociuk added that with the cloud, the focus can be on defining the integration services to collect, process, and store data sources. 

@SunGardAS shared three advantages of managing big data in the cloud: 1) it allows business agility through flexible (scale up and down) capacity; 2) it eliminates the headache of managing and optimizing complex IT infrastructure; and 3) it allows data analysts to interact directly with IT infrastructure consumed for big data analytics – points that @Graham_Irving and @sarabchugh agreed with.

Participants then shared how big data has affected their organization’s products and cloud service offerings.

@ryanprociuk said that big data has enabled the transparency of data, and that they’ve created new data offerings based on increased usage. He later added that the cloud has made it effective to open big data analytical insights to both partners and customers.

@sarabchugh shared that it has allowed his organization, as well as others, to mine the wealth of machine generated data and produce useful insights for their customers. @valb00 said that big data in the cloud has enabled NetApp to process 1 trillion telemetry records.

When asked if they saw a trend among customers becoming more interested in running big data in the cloud, many participants chimed in that they did. 

@sarabchugh said that he’s seen more customers interested in running big data in the cloud, and not just for test/dev use cases. @ryanprociuk added that he’s seen a trend in organizations shifting intensive analytical workloads into the cloud. @SunGardAS stated that for many of their customers, managing and maintaining large-scale IT infrastructure is not their core business focus, so customers are turning to the cloud for business agility, big data analytics, running tier 1 and tier 2 apps, and disaster recovery.

Thanks again to everyone who participated in this month’s #cloudtalk! Stay tuned for details on our next #cloudtalk, and be sure to follow @vCloud and @VMwareSP for future updates. 

It’s All About Prince Charming, Not The Ugly Sisters

By Mathew Lodge

Last week, cloud watchers were treated to the spectacle of a dust-up between IaaS software camps jockeying to be the fairest cloud vendor of them all. Like a bad remake of a Brothers Grimm fairytale, it seemed like the ugly sisters had gotten the message from the magic mirror that they were not the fairest in the land, and had decided to brew some potions, wave a few wands and declare themselves the most attractive cloud IaaS platform.

That’s not to say that VMware doesn’t like a good competitive fight, but the golden rule is to remember that the battle is for the privilege of serving customers. Whoever successfully rides off into the sunset with Prince Charming – the satisfied customer – lives happily ever after.

Last week, while the ugly sisters were squabbling, customers were getting on with business and choosing their Cinderella as VMware quietly passed the 100 vCloud service provider mark. There are now more than 100 verified VMware vCloud public clouds, which is an order of magnitude greater than the ugly sisters’ combined total. You can now get a vCloud in 24 countries, effectively forming the world’s largest community of compatible public clouds. By compatible, I mean where it really matters to customers and their applications: the same VMware cloud infrastructure 350,000 customers trust to run their most demanding datacenter applications, the vCloud API, and the OVF open file format for workload and (perhaps more importantly) data interchange between clouds. 

So what does this mean for customers? The oldest university in the English-speaking world, Oxford, can securely manage and share databases with over 40,000 researchers using hybrid vClouds. Sega, a worldwide leader in interactive entertainment, can standardize its dynamically scalable test and development environment across private and public vCloud infrastructure. You can hear from both on the “Another VMware Cloud” site.

The growth of the vCloud ecosystem tells the Cinderella story: 18 months ago, the first five vCloud service provider partners launched services based on the vCloud Director 1.0 software release. Many of those partners are now leaders in the Gartner IaaS magic quadrant (available for free from Bluelock, one of those leaders, here). Seven of Network World’s “10 most powerful IaaS companies” offer VMware-based IaaS clouds (not including VMware itself, company #10 on this list).

In 2011, VMware’s service provider business overall grew in excess of 200% year-on-year. This is a direct measure of customer adoption since VMware operates a cloudy pay-as-you-go licensing program for service providers. We only get paid when service providers use our software to deliver services to Prince Charmings the world over. Hear leaders from the University of Oxford and Sega talk about what they’re doing with hybrid (private + public) deployments.

Grimm’s Fairytales conclude with more than the heroes riding off into the sunset to live happily ever after. When Cinderella and the Prince are reunited, for example, the ugly sisters see the writing on the wall, attempt to ingratiate themselves with the couple and are rather brutally punished (in the original 1812 book, not the Disney version!) In the end, the Prince always figures out the true state of affairs, seeing through the subterfuge, noise and pretense.

Changing IP Address on vCenter With SRM

By: Kris Boyd, Technical Solutions Architect with VMware, and Chris Colotti, Consulting Architect, VMware Global Center of Excellence

This is a repost from Chris' personal blog, ChrisColotti.us

As part of the Disaster Recovery solution that Duncan Epping and Chris Colotti developed, Kris Boyd has been working on a comparative VMware View solution using the same basic principles. However, both solutions encountered a unique situation with the vCenter virtual machines when trying to use Site Recovery Manager’s ability to change the IP addresses on the guest OS in the recovery site. This situation only seems to affect the vCenter virtual machines and is a very specific condition. It is unique because in most cases Site Recovery Manager talks to an upper-layer vCenter Server, and in our Disaster Recovery solutions there is also a vCenter Server being managed by another vCenter and SRM, as depicted below.

Fig1

Most of us know that SRM has the capability to change the IP on a virtual machine that is part of a recovery plan, and in most cases this works just fine.  There are a few situations, such as Virtual Appliances, that have some issues due to the version of VMware Tools.  Most recently we have seen a specific issue with a vCenter Server (non-appliance) virtual machine as we are doing in both DR solutions.  We wanted to take a few moments to describe the condition and the high level work around that is required to deal with the vCenter Server virtual machine.

Note:  This is not meant to provide the exact steps, but is intended to highlight the areas of consideration.  It also only applies when the recovery site is using a different IP range and addresses must be changed during the failover process.

As you can see from the diagram above, all of the infrastructure components were VMs being failed over to the recovery site as managed by Site Recovery Manager. Since vCenter was one of these machines, we tried (unsuccessfully) to re-IP that VM along with all of the other servers with static IPs using the feature in SRM. As it turns out, not only does SRM fail to re-IP the machine, but the SRM recovery test will fail if you tell it to re-IP vCenter.

When we dug into this, we learned the real reason it was failing. This particular issue has to do with the vCenter services not successfully starting while waiting for the network to come up. SRM needs VMware Tools running in order to change the IP address on a virtual machine. In the case of vCenter Server, the vCenter services try to start before the Tools are running. When the vCenter service starts and cannot connect to the database (because it is on a different network), the service startup just hangs. The guest sits at “Applying Computer Settings” indefinitely. This prevents VMware Tools from even starting, thus preventing SRM from changing the IP address. It’s pretty much a rock-and-a-hard-place situation. SRM was successful in its attempt to re-IP all of the other servers, but vCenter required a set of manual steps to accomplish the same task.

These manual tasks also had their own set of challenges, because if you do things wrong you may find that Windows will never let you log in to this server again, for the same reason. Without going into the gory details, here is what you need to take away from this:

  1. Never assume anything when testing a disaster recovery plan.
  2. vCenter must have its IP updated manually during a failover.
  3. Since other services are dependent on vCenter, you should include a wait step in the SRM recovery process to give you time to update the vCenter IP.
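The ordering problem described above can be summarized in a rough sketch. The helper names and decision rules below are made up for illustration and are not SRM's actual logic or API; the point is simply that SRM's built-in re-IP depends on VMware Tools, which on a vCenter VM never start because the vCenter services hang first.

```python
# Illustrative model of the recovery-plan logic described in this post.
# VM attributes and rules are simplified placeholders, not SRM's API.

def tools_will_start(vm):
    """VMware Tools only report in once the guest finishes booting.
    On a vCenter VM whose services hang waiting for a now-unreachable
    database, 'Applying Computer Settings' never completes, so Tools
    never start and SRM cannot push the new IP."""
    return not (vm["runs_vcenter"] and not vm["db_reachable"])

def plan_reip_actions(vms):
    """Decide, per VM, whether SRM's built-in re-IP can be used or
    whether a manual step (behind a wait step in the recovery plan)
    is required."""
    actions = {}
    for vm in vms:
        if tools_will_start(vm):
            actions[vm["name"]] = "srm-reip"     # normal automated path
        else:
            actions[vm["name"]] = "manual-reip"  # wait step, then by hand
    return actions

vms = [
    {"name": "app01",     "runs_vcenter": False, "db_reachable": True},
    {"name": "vcenter01", "runs_vcenter": True,  "db_reachable": False},
]
print(plan_reip_actions(vms))
```

The "manual-reip" branch corresponds to takeaways 2 and 3 above: pause the recovery plan at the wait step, change the vCenter IP by hand, and only then let vCenter-dependent services start.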

Chris is a Consulting Architect with the VMware vCloud Delivery Services team with over 10 years of experience working with IT hardware and software solutions. He holds a Bachelor of Science Degree in Information Systems from the Daniel Webster College. Prior to VMware he served a Fortune 1000 company in southern NH as a Systems Architect/Administrator, architecting VMware solutions to support new application deployments. At VMware, in the roles of a Consultant and now Consulting Architect, Chris has guided partners as well as customers in establishing a VMware practice and consulted on multiple customer projects ranging from datacenter migrations to long-term residency architecture support. Currently, Chris is working on the newest VMware vCloud solutions and architectures for enterprise-wide private cloud deployments.

The Public Cloud Security Advantage Part I – Stories From Bluelock Customers


While security is still considered a large inhibitor to the adoption of public cloud, it has also become one of the top reasons for some companies to select a public cloud. This is because with increasing regulations and compliance requirements, many companies that need state-of-the-art security infrastructure, software and personnel have found that an enterprise-class public cloud solution is the best option for their computing needs. 

In Part I of this blog series, we’ll highlight the experiences of two Bluelock customers in industries burdened by strict security requirements, whose move to the cloud was motivated by the desire to receive better security than they could provide themselves.

Financial Software Company Uses Bluelock to Deliver Secure Cloud Solution

The first company we’ll highlight is a financial company that produces software to manage trusts and provides investment management software for governments, banks, IRAs and law firms.  According to the company’s president and chief operating officer (COO), the company wanted to move its operations to the cloud in order to achieve these benefits:

  • Secure and reliable platform to provide a SaaS offering
  • Improved security standing
  • Instant scalability to meet the demands of existing clients
  • Flexibility to capture new customers
  • Reduced costs
  • Reduced IT operational demands

While many questioned whether a public cloud solution was the most secure option for a financial company, both the president and COO agreed that between Bluelock’s fingerprint security and mantraps at their facility, “Bluelock’s private cloud has more security than we have at our own datacenter.”  They also added that though the cloud hosting costs roughly match what their hardware costs would have been to build out, they are saving on personnel, operations, and reporting, and in sum, are “getting a better and more secure solution for much less.”

Provider of Smart Meters Meets Strict Utility Security Requirements with Bluelock

The second company we’ll highlight is a software company that provides smart-meter software to utility companies. The company’s operations director explained that their initial motivation for moving to the cloud was to provide infrastructure for development and testing, allowing teams across the world to utilize a common infrastructure that would scale to meet customer needs. They looked at public cloud solutions to simplify their IT operations and provide high security.

The company selected Bluelock due to their numerous security accreditations, strong physical security, and ability to act as a partner and advisor, not just an infrastructure provider. According to the operations director, the top benefits the company experienced from their cloud deployment include:

  • Improved user experience
  • Better overall performance
  • Significant savings on capital outlay

Additionally, the company has shared, “Yes, the cloud is providing cost savings, but our development team has a lot more flexibility and this will create new options for our customers in the future.”

As the experiences of these two companies show, companies of all sizes and industries, even those that are heavily regulated, have seen the value of moving to the public cloud. 

For more information, download the Public Cloud Security Advantage Whitepaper, and stay tuned for Part II of this series, where we’ll be diving into the experiences of two iland customers in their move to the public cloud. For more stories from the VMware Service Provider and Partner Network, be sure to check out the Global Alliances Blog, and follow @vCloud and @VMwareSP for future updates.

Another VMware Cloud: Oxford University Runs Their Hybrid Cloud on VMware

Oxford University is the oldest university in the English-speaking world, with over 900 years behind its belt and nearly 40 world-class colleges to boot. For a university system renowned for its research, the ability to efficiently manage data for all 40,000 of its users is of the utmost importance.

Oxford University has been using VMware technology to virtualize their infrastructure since 2003, so VMware was a natural first choice when it came to undertaking their major Database-as-a-Service hybrid cloud project. Through VMware vCloud Service Provider Colt and VMware’s Hybrid Cloud Activate service, Oxford University was able to achieve what would have taken 6 or 7 months in a matter of weeks.

According to Dr. Stuart Lee, Director of Computing Systems and Services at Oxford University, the University wanted to offer a service that would allow them to quickly spin up a virtual machine for users within the University’s colleges or departments – essentially enabling users to rent VMs. Oxford University received £1.4 million from the UK government as part of the UK’s promotion of shared services in higher education and use of the cloud, along with a strict deadline.

By bringing together all of the University’s research data in a central place, Oxford University was able to: 

  • Assist researchers in managing the data they produced, while also allowing them to quickly tweak it.
  • Reduce the provisioning of towers and desktop machines across the university.
  • Increase efficiency by eliminating the need to build databases on an individual basis. 

 

With the guidance and expertise of VMware and vCloud provider Colt, Oxford University was able to realize their Database-as-a-Service goal, creating a self-service project that enabled users to quickly fire up a database within a central service.

Oxford chose VMware to move to the cloud for three key reasons:

  • VMware is the industry leader in virtualization;
  • VMware builds on your existing virtualization foundation;
  • VMware provides a complete solution stack to power your cloud infrastructure.

Whereas the standard process of building a database and ordering a server can take up to several months, Oxford’s DBaaS solution enabled the University to provision databases in minutes. 

So what’s next for Oxford University? The University is now looking at providing Oxford’s Database-as-a-Service solution to other institutions in the UK as part of the UK’s promotion of shared services within higher education. They are also looking towards linking up with the UK’s national public cloud.

Visit Another VMware Cloud to learn more about VMware’s Hybrid Cloud Activate service and other companies who have successfully deployed a public or hybrid cloud model through VMware. Be sure to follow us on Twitter at @vCloud and @VMwareSP for more Another VMware Cloud stories! 

Big Data in the Cloud – Join Us for Our Next #cloudtalk

The conversation around big data has grown exponentially in recent years, with even the Obama Administration announcing a White House-sanctioned research and development push to promote the improved use of big data.

In IT, the term “big data” refers to datasets whose size makes it difficult to work with using common software management tools. What does this growing enterprise IT trend mean for the cloud? According to a recent article in Voice & Data, the most important driver for big data is cloud computing. This is because cloud computing makes big data possible by providing an elastic pool of resources that can scale accordingly to handle big data. Additionally, the cloud enables IT teams to be more efficient in their use of resources, so that they can invest more in big data.

For our next #cloudtalk on Tuesday, April 10th at 11am PT, we’d like to invite our service providers, partners and the larger cloud community to share how big data has affected their products and service offerings, and how they see big data evolving in the cloud in the coming year.

Here’s how to participate in #cloudtalk:

  • Follow the #cloudtalk hashtag (via TweetChat, TweetGrid, TweetDeck or another Twitter client) and watch the real-time stream.
  • At 11am PT @vCloud will pose a few questions using the #cloudtalk hashtag to get the conversation rolling.
  • Tag your tweets with the #cloudtalk hashtag. @reply other participants and react to their questions, comments, thoughts via #cloudtalk. Engage!
  • #cloudtalk should last about an hour. 

In the meantime, feel free to tweet at us (@vCloud) with any questions. Look forward to having you join us next Tuesday for #cloudtalk!