Last year at VMware Explore Europe, our EUC Keynote featured an interview with Bob Smart, Senior Solutions Expert at Orange Business Services (OBS). Bob shared details of a new voice-enabled support service offered by OBS.
Driven by a focus on user experience, OBS identified the helpdesk as a particular point of friction for hybrid employees. When you’re not in the office, you can’t lean over to ask your colleagues for help, and you might not want to take the time to put in a ticket or call someone.
In response, the team at Orange put together a service that takes voice recognition, natural language processing, and robotic process automation from ybot, and combines it with the management and digital employee experience capabilities of Workspace ONE.
There are several trends overlapping here: self-service and reducing helpdesk costs; automation; and voice-based user interfaces and natural language processing.
We wanted to talk more about these trends and how Orange is working with VMware to create this offering, so I spoke to Bob to take another look.
Jack Madden, EUC Product Marketing, VMware: The idea of reducing helpdesk tickets and achieving the associated ROI is something that we hear about frequently. As you’re rolling this out, what are the types of helpdesk tasks that are most suited for automation? What are the quick wins for customers?
Bob Smart, Senior Solutions Expert, Orange Business Services: There are user calls that are very repetitive for service desk agents. These calls may require low technical inputs, but they take time for the agent to resolve. They may also distract the agent from more important tasks or projects. When I was working in retail, for instance, we had 200+ calls a month just for password resets.
We work with operations teams to identify the top three to five recurring calls to their service desks. When you consider that every agent call can cost somewhere between $20 and $30, compared with a per-user, per-month license, then the ROI is very good.
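That cost comparison is easy to sanity-check. As a rough illustration (all figures hypothetical except the $20–$30 per-call cost Bob cites; the function and its parameters are illustrative, not part of the Orange offering):

```python
# Rough ROI sketch for automating repeat service-desk calls.
# All numbers are hypothetical except the $20-$30 per-call cost cited above.

def monthly_savings(automated_calls, cost_per_call, users, license_per_user):
    """Net monthly savings: deflected agent-call cost minus license spend."""
    return automated_calls * cost_per_call - users * license_per_user

# e.g. 200 password-reset calls a month at $25 each,
# for 500 users paying a hypothetical $2 per-user, per-month license:
print(monthly_savings(automated_calls=200, cost_per_call=25.0,
                      users=500, license_per_user=2.0))  # 4000.0
```

Even with conservative assumptions, deflecting a few hundred repetitive calls a month covers a per-user license many times over.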
Jack: I think the part that may be newer to EUC teams is taking the natural language queries and transforming them into the API calls for Workspace ONE. How does this training process work?
Bob: We work with customers to identify key trigger words — much like Siri or Alexa in the consumer market. The middleware can intercept these trigger words, and then action backend systems or processes, using the documented API configurations. Any calls or voices not understood are diverted to the service desk as normal.
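The dispatch pattern Bob describes can be sketched in a few lines. This is a minimal illustration, not the ybot middleware itself: the trigger phrases and handler functions are hypothetical stand-ins for the documented Workspace ONE API calls, and the fallback return models the divert-to-agent path.

```python
# Minimal trigger-word dispatch sketch: map recognized phrases to
# backend actions, and divert anything unrecognized to the service desk.
# Trigger phrases and handlers below are hypothetical illustrations.

def reset_password(user):
    # Stand-in for the real backend/API call.
    return f"password reset started for {user}"

def unlock_account(user):
    return f"account unlocked for {user}"

TRIGGERS = {
    "reset my password": reset_password,
    "unlock my account": unlock_account,
}

def handle_utterance(text, user):
    """Intercept known trigger phrases; otherwise escalate to an agent."""
    for phrase, action in TRIGGERS.items():
        if phrase in text.lower():
            return action(user)
    return "diverted to service desk"

print(handle_utterance("Please reset my password", "alice"))
# -> password reset started for alice
```

The key design point is the fallback: anything the middleware cannot confidently match is routed to a human agent rather than guessed at.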
We install and check, and then improve, during the user acceptance process. The systems also get richer over time. We learn or train the middleware, so that more resolves can be made, more languages added, and more use cases supported.
Jack: For this offering, what do customers need to do to get started?
Bob: As mentioned, the customer first needs to analyze their top three to five repeat calls to their service desk and identify which of their backend systems need to be accessed to resolve these queries.
The work that I particularly like is supporting hybrid workers with PC-related issues. By combining the middleware with Workspace ONE and DEEM (Workspace ONE Digital Employee Experience Management), we can both resolve PC “go-slows” and proactively manage the PC hardware so that break-fix becomes proactive rather than a response to a user request.
Jack: What apps or agents can customers use to interface with the ybot middleware?
Bob: We are able to take most voice inputs into the system. We see a lot of Microsoft Teams, Cisco Webex, and Zoom, but any voice input can be translated into code.
Jack: As consumers, we’re used to asking the various digital assistants in our lives to set a timer, turn on the lights, or send a message. And most people have interacted with chatbots for things like customer support. So IT support seems like a natural next step. But you’re also envisioning a variety of industry vertical use cases, and I’m sure many frontline roles could benefit from hands-free interaction. What are some of the top use cases and early successes that you anticipate?
Bob: Actually, let’s go back a step first and agree that interfacing with a remote service desk can at times be complex, frustrating, and slow, with the resolve measured in hours. Most of us prefer a local service desk, where we can take our PC to a colleague, describe the issue, and have them explain the resolve and walk us through the steps.
What the middleware in this offering does is to personalize and speed that service — especially for hybrid or remote workers. You speak directly to your end point and the resolve is triggered.
New entrants to the workforce need to be understood too. Anecdotally, we suspect that younger generations, in particular, do not like speaking to the service desk on the phone. Indications are that users will prefer self-service or self-resolve processes.
As we watched the technology being used, we realized that certain use cases call for “screenless” computing, with the voice bot as the primary interface: users speak commands and have data or information spoken back to them. We have worked with emergency services and logistics companies to trial screenless computing. Another use case we are considering is AR and VR headsets and glasses, where configuration could be simplified with both Workspace ONE and the voice middleware.
Thanks to Bob and OBS for sharing more about this offering.
To learn more, read about the offering from Orange Business Services and the VMware Digital Employee Experience solution.