7 Tools and Tips for Tracking Your Experiments

Advice from several product managers on how to track your assumptions and hypotheses

Over the past 6–7 years, product teams around the world have changed their approach to building products. Concepts and principles such as The Lean Startup have introduced techniques for validating features, ideas, and products while keeping risk low and value high.

We track the assumptions we have about our ideas and, through hypothesis-driven development, test the riskiest ones, the assumptions that could invalidate our ideas. At Pivotal, we have some techniques that we teach our clients. But beyond using the hypothesis template, what are teams doing to track this work? And what can we learn from them?

Since these techniques are so new to software, we’ve noticed different teams experimenting in different ways. To find out just how teams across the world have iterated on this, I interviewed product managers in our San Francisco, Detroit, Tokyo, and Boston offices.

Below are just a few tools and tips these Pivotal product managers were able to share with me.

1. Feel like you’re not taking actionable feedback from experiments? Capture what you know right now to fight bias.

A consistent message in practicing lean is capturing your assumptions. We’re often wrong about what we think is right, or how our customers will react to an idea. Yui Anzai, a product manager in Pivotal Labs Tokyo, works with clients to write down “what do we believe right now, and what do we want to prioritize? Based on what we found, were these things correct? Not correct? What are our actions? Have we validated enough to feed into development?”

Jay Badenhope, a product manager in the Pivotal Labs San Francisco office, adds:
“Put a stake in the ground, and learn from it. Write yourself a letter and seal it in an envelope and see what happens. It fights confirmation bias.”

Simply stating what is known at the moment and being intentional about following up on what we’ve learned can help teams be honest with themselves.

2. Have teams that get discouraged with experiments that don’t work? Keep a “Learning” not “Failing” mindset.

It’s easy to think of failing as negative, and teams can quickly be discouraged by an experiment that “doesn’t work”. Try putting a different spin on experiments that fail.

“If you invalidate your assumption, do not say it failed; you successfully invalidated your hypothesis,” says Jay B.

As Jay mentions above, shift your view from “failing” to “learning”. Failing is not a bad thing, but if your team is hitting a mental block around next steps or taking action, try reframing that “failing” mindset as a “learning” mindset.

3. Not sure how to test an idea fast? Fake a (risky) feature.

Somesh Bladawa, a Pivotal Labs product manager in Michigan, describes below how he and his team faked an experience before building an additional feature:

Context: We were developing a telematics product for a client operating a fleet of hundreds of vehicles.

Problem: We observed hours of vehicle idling that resulted in fuel waste.

Goal: Reduce fuel waste by reducing idling.

Experiment: Our hypothesis was that if we contacted the worst-offending drivers directly (instead of giving the wastage report to management) and told them they were above average in idling, they would reduce their idling time.

MVP: We pulled the idling numbers from our application and selected the five drivers with the most idling time. We manually emailed those drivers at the end of the day with their idling data.

Observation: The drivers reduced idling significantly after receiving our email. Some also responded to us saying they would try to reduce idling waste.

Somesh attested that “this experiment helped us prove the hypothesis before we wrote any code to build this functionality. It also proved that this approach can outperform the approach every competitor in the industry took (i.e., emailing the supervisor or showing the driver’s score in a supervisor dashboard).”
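
To make the pattern concrete, here is a minimal sketch (in Python) of what an automated version of this experiment could look like once the hypothesis is validated. Everything in it, from the data shape and addresses to the fleet average and the SMTP host, is a hypothetical placeholder, not Somesh’s actual implementation:

    # Hypothetical sketch: script the manual experiment by picking the five
    # drivers with the most idling time and emailing each their daily numbers.
    # Data shape, addresses, and SMTP host are illustrative assumptions.
    import smtplib
    from email.message import EmailMessage

    FLEET_AVERAGE_MINUTES = 42  # assumed baseline for comparison

    # driver email -> idling minutes today (placeholder data)
    idling_by_driver = {
        "driver1@fleet.example": 95,
        "driver2@fleet.example": 80,
        "driver3@fleet.example": 71,
    }

    def send_idling_reports(smtp_host="localhost"):
        # Select the five drivers with the most idling time today
        worst = sorted(idling_by_driver.items(), key=lambda kv: kv[1], reverse=True)[:5]
        with smtplib.SMTP(smtp_host) as smtp:
            for address, minutes in worst:
                msg = EmailMessage()
                msg["From"] = "fleet-insights@fleet.example"
                msg["To"] = address
                msg["Subject"] = "Your idling today"
                msg.set_content(
                    f"You idled for {minutes} minutes today; the fleet average "
                    f"is {FLEET_AVERAGE_MINUTES}. Reducing idling saves fuel."
                )
                smtp.send_message(msg)

The manual email was the MVP; code like this is only worth writing once the hypothesis has survived contact with real drivers.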

4. Having trouble framing and taking actions with your assumptions? Try this structure.

Christopher Senso, a product manager in the Boston office, explains how his team constructs learning statements before kicking off research. “First we were at the whiteboard generating stickies in three phases:

When we started, we thought that ______________ (dump assumptions)

Then we learned _____________ (Key things we learned during D&F)

Which is why _____________ (decision we made, or new hypothesis to test)

“This led us to write D&F (Discovery & Framing) learning statements that we presented to our stakeholders.” This is a great way to share the direction you’re heading on an idea, and what you’ve learned so far, with anyone on your team or involved with your product.

Let’s take an example from Airbnb, when they were attempting to improve bookings early in their journey.

When we started, we thought that

  • There were several reasons that our listings were not getting booked

Then we learned

  • That all 40 listings had terrible pictures

Which is why

  • We want to travel to some of the hosts
  • And we want to take professional pictures of each listing

5. Stuck on generating assumptions? Write a press release for your product as if it were already a huge success, then pick apart all of the assumptions that underpin it!

Struggling to come up with goals or assumptions? Try a technique first popularized by Amazon: writing a future press release.

From Jay B: “Have the team write a press release — what does the business look like and what are customers saying? Can you quantify that? What numbers would you put in there? From this, we can tease out what are the qualitative assumptions to give you a start.”

This approach often helps the team bring to light assumptions that haven’t come up already.

6. Working with remote teams? Use a shared, digital whiteboard to manage and track your assumptions and experiments.

Rhea Kaw and Usha Ramachandran in our San Francisco office work on Pivotal Cloud Foundry and recently experimented with RealtimeBoard, a shared online whiteboard, to track their assumptions and experiments. Rhea says, “It can seem daunting to be in a new space [tracking assumptions], but this helps frame and prioritize the team to come together on the most important thing we need to learn.”

Usha and Rhea had another point to make about tracking assumptions: as you generate them, they “found that if not everyone participating had the shared context, the quality of the assumptions varied and they weren’t always applicable.”

7. Struggling to tie your experiments to your organization’s goals/outcomes? Try this template.

Jay B. has put together a quick Google Sheet that captures not only the high-level vision and goals, but also the experiments your team runs. This reminds the team to assess how much risk each assumption presents through the “lens” of the product vision: how likely is the product to fail if you’re wrong about this? One very important piece is the “Next Steps” section. It’s easy for teams to develop confirmation bias and run an experiment without ever deciding what actions they would take if the experiment fails; the “Next Steps” section is a good place to document those actions in advance.
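
For illustration, one row of such a sheet might look like the following, reusing the idling example from tip 3. The column names here are assumptions for the sake of the example, not Jay’s exact template:

  • Assumption: drivers will act on their own idling data
  • Risk through the lens of the vision: high, because if drivers ignore the data, the product’s core promise fails
  • Experiment: manually email the five worst idlers their daily numbers
  • Result: validated, idling dropped significantly
  • Next Steps if invalidated: test routing the report through supervisors instead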

Everyone has their own flavor. Once you understand the basics, you can tweak these techniques to fit your team’s needs. These are just a few of the techniques and formats our teams have tried through iteration. If you have ideas or things that have worked for your team, please feel free to share them in the comments.
