A new customer technical case study was published recently on Skyscape's use of vSphere as its platform for deploying Hadoop in the cloud. Skyscape, based in the UK, deploys Hadoop clusters on demand for its UK Government customers from the company's public cloud infrastructure. These government departments leverage Hadoop for data gathering and analysis in order to provide citizen services data and analysis tools.
The newly provisioned Hadoop clusters are currently based on the Hortonworks HDP platform, with plans to support other Hadoop distributions in the future. The Skyscape engineers built an impressive solution on the Big Data Extensions (BDE) platform: the system provides each end user not only with a Hadoop cluster but also with a dedicated Ambari Server to manage and monitor that cluster, all running on x86 hardware servers with direct-attached storage. Skyscape made use of the BDE REST APIs to achieve this. Five separate end-user customer groups signed up shortly after the Hadoop service was released to their community.
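To make the REST-driven approach concrete, here is a minimal sketch of how a provisioning script might assemble a cluster-creation request for the BDE management server. The server URL, endpoint path, and JSON field names below are assumptions for illustration only; consult the BDE REST API documentation for the exact contract.

```python
# Sketch: building a cluster-creation payload for the BDE management server.
# Endpoint path, port, and JSON field names are illustrative assumptions.
import json

BDE_SERVER = "https://bde.example.internal:8443"  # hypothetical management server


def build_cluster_spec(name, distro, worker_count):
    """Assemble a minimal cluster-creation payload (illustrative field names)."""
    return {
        "name": name,
        "distro": distro,  # e.g. a Hortonworks HDP distribution registered with BDE
        "nodeGroups": [
            {"name": "master", "roles": ["hadoop_namenode"], "instanceNum": 1},
            {"name": "worker", "roles": ["hadoop_datanode"], "instanceNum": worker_count},
        ],
    }


spec = build_cluster_spec("dept-analytics", "hdp", 4)
payload = json.dumps(spec)
# An HTTP client such as requests would then POST the payload, e.g.:
#   requests.post(BDE_SERVER + "/clusters", data=payload,
#                 headers={"Content-Type": "application/json"})
print(len(payload) > 0)
```

A real integration would add authentication and poll the returned task for completion, but the pattern of composing a JSON cluster specification and POSTing it is the essence of automating BDE from outside the vSphere Web Client.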
Two other very interesting and useful blogs on virtualization of big data appeared recently: one on using Big Data Extensions 2.2, written by Julie Roman, a Technical Account Manager at VMware who has worked on big data projects, and another (published on LinkedIn) on Big Data as a Service by George Trujillo, a VP at a financial services company. Both are well worth reading in their respective areas!
Our next webcast in the vSphere 6 webcast series is all about increased efficiency of running your data center via automation. Brian Graf, VMware's PowerCLI guru, will discuss what's new in vSphere 6 for PowerCLI as well as show off some tips and tricks that will wow you.
This webcast takes place July 7 at 9am PST. Register for the webcast today!
Next up in the vSphere 6 Webcast Series is the very important topic of business continuity. For many, business continuity equals business productivity and application downtime needs to be minimized or eliminated. Learn how vSphere 6 reduces downtime for applications and maximizes productivity for businesses. Matt Meyers will lead our discussion on availability and data protection.
This webcast will occur on Tuesday June 30 at 9am PST and there will be a team present to answer questions you may have. Register today for the free live webcast.
We are adding a special webcast to the vSphere 6 Webcast series.
The next webcast will be on the topic of agile Big Data happening on June 23 at 9am PST.
Learn how vSphere 6 + Big Data Extensions 2.2 can help you build an agile Big Data platform: provision your Hadoop clusters faster, simplify configuration and management of your nodes, and scale and utilize resources efficiently to save time and money, all with no impact on performance. Justin Murray, our Big Data guru, will share his insights and answer any questions you may have.
Not yet on vSphere 6? Join us for a webcast to learn why you should be. Starting June 2nd, 2015 and recurring every other Tuesday at 9AM, join the vSphere product experts to learn what’s new and exciting about vSphere 6! A different topic will be covered each session and time will be allocated at the end of each webcast for Q&A.
Please always check the latest schedule each week as topics may change and sessions may be added or removed.
In the previous post, Configure DHCP and TFTP for Auto Deploy, we discussed how to set up your DHCP and TFTP servers to allow your ESXi hosts to PXE boot. However, once an ESXi host boots, it needs directions to know what to boot. This is where Auto Deploy rules come in.
When you think of VMware, virtualization clearly jumps to mind. But if you take a step back, virtualization is really a means to an end. IT pros don’t earn their salary because they run virtual machines -- but VMs support application services that are essential to business, ultimately contributing to the bottom line. VMware is focused on providing the best place to run any application; from LAMP stacks to business-critical workloads to big data analytics, vSphere can handle it all.
Two open source projects were just announced by the Cloud-Native Apps group: Project Photon and Project Lightwave. Both of these projects will be foundational elements for running Linux containers and supporting next-generation application architectures. This marked a big milestone in the lifecycle of VMware Cloud-Native Apps, and at first glance may seem to be a lot more relevant to application developers than the traditional vSphere audience, but there really is a great tie-in to the Software-Defined Data Center.
If you’re a vSphere administrator, an important part of your role is supporting the developers that create the apps that run on your infrastructure. There is a shift underway with developers right now – moving from a traditional waterfall model to agile, continuous integration. For a specific example of the change in mindset from previous software development processes, check out The Twelve-Factor App to see why the container enthusiasm starts to really make sense.
Today, customers trust their Software-Defined Data Center based on VMware infrastructure for any app. It would be a shame if a new platform for applications came along and brought back the silos of yesteryear. This is why vSphere admins should care about next-generation applications and the corresponding infrastructure. The container runtime becomes another essential component of the infrastructure, and it should be integrated for seamless operation. With Photon, VMware is going to make it easy to run containers alongside all of the other workloads - no silos here!
Photon is going to be available in places where developers expect to find it. For example, many developers use HashiCorp Vagrant as an easy means of pulling down standardized VM images from a central repository. A Photon image will be available there and elsewhere, enabling the same container runtime on laptops, in the datacenter, and in public clouds.
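As a sketch of that developer workflow, the usual Vagrant commands for pulling down and booting a published box look like this. The box name "vmware/photon" is an assumption for illustration; check the Vagrant box catalog for the actual published name.

```shell
# Pull down a Photon VM image via Vagrant and boot it locally.
# "vmware/photon" is an assumed box name for illustration.
vagrant init vmware/photon   # writes a Vagrantfile referencing the box
vagrant up                   # downloads the box (first run) and boots the VM
vagrant ssh                  # log in and start experimenting with containers
```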
Administrators will like the fact that Photon has a small footprint because it is not weighed down with all of the packages typically found on a Linux system, and one can draw parallels with the VMware ESXi thin hypervisor. Less is more when it comes to infrastructure – fewer patches, less administration, and improved SLAs are among the key benefits.
The companion open source project - Lightwave - is an authentication and authorization platform that originated in the vSphere platform. It provides multi-master replication for scalable high availability and flexible topology choices to accommodate any architecture.
There is great integration between Lightwave and Photon. In fact, Lightwave is designed to actually run directly on Photon instances – no general-purpose OS needed. Take a look at this demo video where a new Lightwave domain is created, Photon clients are joined to the domain, and ssh logins are authenticated against directory credentials, eliminating the need to manage local user accounts.
Linux containers are all the rage right now, but it’s not a zero-sum proposition. Containers run great on vSphere and VMware is investing accordingly. VMware SDDC administrators can be confident that their platform is, and will be, the best for any application - with the security, manageability, and governance that enterprises need.
Software-Defined Storage is making waves in the storage and virtual infrastructure fields. Data and infrastructure are intertwined, and when they’re both brought together, companies can cut down on expenses and increase productivity.
Rawlinson Rivera, Principal Architect, Storage and Availability, recently hosted a webinar discussing how VMware is approaching Software-Defined Storage (SDS) and virtualization in recently announced VMware updates, including VMware Virtual SAN 6.0.
Software-defined storage offers organizations the ability to automate, distribute, and control storage better than ever before. SDS can provision storage for applications on demand, without complex processes. It also allows for standardized hardware, reducing costs for businesses everywhere.
To bring customers the best software-defined storage experience, we had to update VMware® Virtual SAN™. And we did just that. With VMware Virtual SAN 6.0, we introduced several new features with SDS in mind:
Software-defined storage optimized for VMs
All Flash architecture
Broad hardware support
The ability to run on any standard x86 server
Enterprise-level scalability and performance
Per-VM storage policy management
Deep integration with the VMware stack
There’s a lot more to unpack from the latest updates to our VMware solutions. For a more in-depth guide to what’s new and how it affects you, watch the webcast here!
Rawlinson Rivera announced, and explained the inner workings of, VMware Virtual SAN 6.0 — VMware’s latest software-defined storage product. Virtual SAN 6.0 introduces support for an all-flash architecture and hybrid architectures, among many other innovations. This is a blog post you shouldn’t miss.
We released vSphere Virtual Volumes (VVOLs) alongside the announcement of vSphere 6.0. In this post, Rawlinson Rivera explains how VVOLs can drive a more efficient operational model for external storage.
The latest release of SAP HANA has brought the concepts of multiple-temperature data and lifecycle management to a new level. Bob Goldsand talks more about this, as well as native use cases and dynamic tiering with VMware HA and workload management.