If you’ve ever sat for a VMware Certified Advanced Professional (VCAP) Deploy certification exam, you will be familiar with our performance-based testing format. Performance-based exams (also known as lab exams) are an alternative to traditional multiple-choice exams, used to test a candidate’s ability to carry out specific tasks. Think of them as a way to “test by doing” rather than a test of knowledge. For candidates in certain job roles and at certain stages of their careers (as well as for employers wanting to make sure they hire the right caliber of candidate), it is the testing format of choice, providing a real-world setting in which candidates can validate their knowledge and skills in a practical, hands-on way, just as if they were “on the job.” That said, there is still a very valid use case for the more traditional multiple-choice format, for example when validating analytical problem-solving skills or theoretical knowledge. That is why at VMware we use both testing formats, and we plan to continue doing so as dictated by the job roles targeted and the skills being validated.
So now that you know WHEN and WHY we use performance-based testing, ever wonder HOW these lab exams are made?
Certification Insights: How VMware Lab Exams Are Made
At VMware, we have a Hands-On Lab (HOL) – Certification team that we partner with when developing a new lab exam. The HOL-Certification team is constantly identifying products and features that should be showcased in an HOL environment available to everyone, customers and partners alike. These HOL environments provide a free and easy way to access VMware products and solutions, test use cases, and learn about the latest features, with no installation required.
When kicking off a new lab-based VCAP Deploy exam, the VMware certification team engages the HOL-Certification team to identify an existing HOL template that can support the exam request. The certification team then works with the group of subject matter experts (SMEs) developing the exam items to storyboard them and identify any changes needed in the HOL environment to support those items. The HOL-Certification team implements the necessary environment changes per the SME recommendations. This is a very iterative process, mostly because items need to be written to be independent of one another and tweaked accordingly.
Once the exam items and their associated lab environments are developed, it is time to develop the grading script for scoring them. Following formal training, each SME is responsible for developing the scoring script for the question they developed, using a template. This process (an iterative one as well) entails extensive validation by the SMEs to ensure the individual scripts are free of errors. The certification team works with individual SMEs to ensure script development guidelines and deadlines are met. Once ready, the HOL-Certification team combines the individual scripts into one master grading script for another round of validation. The certification team, volunteer SMEs, and HOL-Certification team members thoroughly test the master scoring script in a production setting, and any required changes are made accordingly.
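To make that structure concrete, here is a minimal, hypothetical sketch of how per-item scoring checks could be combined into a master grading script. The item names, expected values, and one-point-per-check scoring model are invented for illustration; they are not VMware's actual implementation, and a real script would inspect the live lab environment rather than a dictionary.

```python
# Hypothetical sketch: each SME writes an independent check for their item,
# and the master script runs every check and aggregates the scores.
# All names and expected values below are illustrative assumptions.

def check_vm_renamed(lab_state):
    """Item 1: one point if the candidate renamed the VM as instructed."""
    return 1 if lab_state.get("vm_name") == "web-prod-01" else 0

def check_dns_configured(lab_state):
    """Item 2: one point if the expected DNS server was configured."""
    return 1 if "10.0.0.53" in lab_state.get("dns_servers", []) else 0

# The master script runs item checks independently, so one incomplete task
# (or one faulty check) cannot affect the scoring of the others.
ITEM_CHECKS = [check_vm_renamed, check_dns_configured]

def grade(lab_state):
    results = {}
    for check in ITEM_CHECKS:
        try:
            results[check.__name__] = check(lab_state)
        except Exception:
            results[check.__name__] = 0  # a broken check never crashes the run
    return results

if __name__ == "__main__":
    # Simulated end state of a candidate's lab: task 1 done, task 2 not.
    candidate_lab = {"vm_name": "web-prod-01", "dns_servers": ["8.8.8.8"]}
    scores = grade(candidate_lab)
    print(scores, "total:", sum(scores.values()))
```

Keeping each check independent mirrors the requirement mentioned above that exam items be written independently of one another: a candidate's score on one task never depends on whether they completed another.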
Because the development of these lab exams is so unique, they require a unique set of skills from SME item authors. Not only do SMEs need to be experts in the technology and products covered in the certification exam, but they must also have an extensive background in scripting and storyboarding. SMEs go through a vetting process by the certification team lead and are selected based on a set of criteria.
Christopher Lewis, Lead Solutions Architect – Multi-Cloud Management, is a seasoned certification subject matter expert who has contributed to developing both traditional exams and performance-based lab exams at VMware, giving him a unique perspective on the process.
Writing performance-based exam questions is a creative journey that really pushes me. Each question starts as a spark of an idea based on the specific exam, topic, or objective that needs to be tested. Typically, each SME writes more than one question at a time. My ideas are normally taken from real-world scenarios or customer interactions, because I believe that makes them more plausible and realistic. Once I have the outline documented, it forms the first draft of the question. I then review the lab environment to see whether the question I want to write is actually possible in the lab environment we have (normally we make a few subtle changes). If the question isn’t going to work, then it is back to the ideation stage. If it is going to work, I document all the changes needed to support the question, making sure no other questions are impacted by those changes. I then write the question stem before documenting all of the steps I would expect the candidate to take to answer the question successfully. We do this so we can easily create scoring scripts later in the process.
Obviously, there are normally a number of different ways to achieve a particular outcome, so we have to take that into consideration too. At this point I present the idea back to the team and, typically, the Certification Team Leader has to remind me that there is a time limit for the exam and my question is not the only question in it. I then revise the question to remove some of the more mundane and repetitive steps that would be required in the real world. This ensures that the essence of the question and, more importantly, the task the candidate needs to complete remain intact while the time to answer the question is reduced. Once I have made all the changes, the question is reviewed by other SMEs, and additional revisions continue until we have an agreed-upon question that is legally defensible.
So, now that you know HOW they are made, why not try your luck at a VMware VCAP Deploy lab exam today! There are Exam Prep Guides, VMware Learning Courses, Learning Paths, and Hands-on Labs designed to help you prepare!