Instrumenting for Observation

Note: This case is the ‘sequel’ to A Design You Can Test and assumes familiarity with its contents.

From the Big Picture to the Small Picture, And Back

The digital team at HVAC in a Hurry is charged with using digital tools to improve the work of the company’s field technicians. After conducting interviews, they concluded that the problem/job of getting replacement parts to the job site was a good place to start. You can find their notes on the jobs-to-be-done they observed and the metrics they identified as focal points for testing solutions in Exhibit A.

They then concierged (hand tested) a solution and, after iterating on it, moved forward to a digital design. That design is anchored in a set of user stories, and from there the team explored prototypes, which they tested with the technicians. The question driving those tests: given a stated goal, which prototype made it easiest for the technician to achieve it?

The team also translated their ‘big picture’ view of how to observe improvements in the process to the ‘small picture’ of the individual user stories. For example, how does user engagement with a given search function relate to the bigger picture of onboarding, engagement, and retention with the parts app as a whole? You can see the results of that work in Exhibit B.

Keeping Your Eye on the Ball

The team is fundamentally committed to using evidence to focus their work and drive toward value. They’ve taken the focal points you see in the user stories table under Analytics (Exhibit B) and implemented them in Google Analytics for quantitative observation and in Mouseflow for qualitative observation.

You’ll find notes on their use of Google Analytics in Exhibit C and Mouseflow in Exhibit D.
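
Before digging into the exhibits, here is one way the mapping from ‘big picture’ focal points to concrete events might be written down before any tagging begins. This is a minimal sketch in TypeScript; the three stages and their meanings come from Exhibit A, but every event name is an illustrative assumption, not the team’s actual schema.

```typescript
// Sketch: an event taxonomy tying each focal point from Exhibit A to
// the concrete events that would evidence it. The stage names and
// meanings are from the case; the event names are assumptions.
type FunnelStage = "onboarding" | "engagement" | "outcomes";

interface FocalPoint {
  meaning: string;      // "What does this mean?"
  intervalDays: number; // "What is the interval?"
  events: string[];     // events that would evidence it
}

const taxonomy: Record<FunnelStage, FocalPoint> = {
  onboarding: {
    meaning: "Signing up and ordering one part",
    intervalDays: 1,
    events: ["sign_up", "part_ordered_online"],
  },
  engagement: {
    meaning: "Consistently using the tool to look up and order parts",
    intervalDays: 30,
    events: ["part_search", "part_ordered_online"],
  },
  outcomes: {
    meaning: "Reduced job overhead and increased customer satisfaction",
    intervalDays: 90,
    events: ["job_completed", "csat_submitted"],
  },
};
```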

For your project:

  1. Make sure your ‘big picture’ focus is clear through job(s)-to-be-done and a metrics implementation of them across onboarding, engagement, and retention.
  2. Make sure your ‘small picture’ focus is clear through testable user stories and individual metrics that you can relate back to your ‘big picture’.
  3. Implement and validate your quantitative observations with Google Analytics.
  4. Implement and validate your qualitative observations with Mouseflow.

Exhibit A- Jobs-to-be-Done and Implementation Metrics

Jobs-to-be-Done

The table below describes a few problem scenarios/JTBD for Trent the technician.

Job-to-be-Done/Problem Scenario: Finding up-to-date reference documentation in a usable format
Current Alternative(s): Carry printed manuals. (They think so, anyway; later they find out the techs are actually Googling vendor documentation, and that works fine.)
Value Proposition: We’ll offer a library of vendor manuals that are indexed and searchable in a consistent way tailored to the technician’s needs. (They were able to eliminate this VP with customer discovery; the alternative actually works quite well.)

Job-to-be-Done/Problem Scenario: Getting replacement parts to a job site
Current Alternative(s): Call the office and request the part, then wait for an update on the phone or through a call-back
Value Proposition: We’ll create a self-service parts ordering process with greater transparency on cost and turnaround time, lowering the overhead of fulfilling parts orders and improving customer satisfaction.

Implementation Metrics

The team identified the following focal questions as they observe whether various cohorts of technicians adopt the new tool (or not), create habits around using it (or not), and actually improve their work outcomes (or not).

For each of the three phases (Onboarding, Engagement, and Outcomes), the team asks the same five questions:

  • What does this mean?
  • What is the interval?
  • How might we test this?
  • What are the metrics?
  • What’s tricky? What do we need to watch?

The team arrived at the following view of how that might look:

Onboarding
What it means: Signing up and ordering one part
Interval: 1 day
Testing: Usability testing, then shadowing reps with various options or mandates to use the tool
Metrics: Number of Signups & Signups with >0 Orders
What’s tricky: Pulling vs. pushing the use of the tool

Engagement
What it means: Consistently using the tool to look up and order parts
Interval: 30 days
Testing: Observe cohorts across onboarding programs & product iterations
Metrics: Parts Ordered Online/Parts Ordered by Tech >80%
What’s tricky: Engagement vs. outcomes with pulling vs. pushing

Outcomes
What it means: A combination of reducing the overhead to complete a job and increasing customer satisfaction
Interval: 90 days (est.)
Testing: Observe outcome metrics across cohorts
Metrics: Reduced Job Time; Reduced Turnaround Time; Increased Customer Satisfaction/Job
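
To make the metric definitions above concrete, here is a rough sketch of how the Onboarding and Engagement metrics could be computed from a flat export of events. All field and event names are assumptions for illustration, not the team’s actual data model.

```typescript
// Sketch: computing the Onboarding and Engagement metrics above from a
// flat export of events. All field and event names are assumptions.
interface EventRow {
  userId: string;
  event: "sign_up" | "part_ordered_online" | "part_ordered_by_phone";
}

// "Number of Signups & Signups with >0 Orders" (Onboarding)
function onboardingMetrics(rows: EventRow[]) {
  const signups = new Set(
    rows.filter(r => r.event === "sign_up").map(r => r.userId)
  );
  const ordered = new Set(
    rows.filter(r => r.event === "part_ordered_online").map(r => r.userId)
  );
  const signupsWithOrders = Array.from(signups).filter(id => ordered.has(id));
  return { signups: signups.size, signupsWithOrders: signupsWithOrders.length };
}

// "Parts Ordered Online/Parts Ordered by Tech >80%" (Engagement)
function engagementRatio(rows: EventRow[]): number {
  const online = rows.filter(r => r.event === "part_ordered_online").length;
  const phone = rows.filter(r => r.event === "part_ordered_by_phone").length;
  const total = online + phone;
  return total === 0 ? 0 : online / total; // compare against the 0.8 target
}
```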

Exhibit B- User Stories

The team is focused on implementing the epic story below. This story is just part of the larger arc of identifying a part, ordering it, getting it on site, and completing a repair.

Epic User Story

‘As Ted the HVAC technician, I want to know the pricing and availability of a part that needs replacing so I can decide my next steps.’

Storyboard

[Storyboard: hvac-epic-story]

User Stories & Metrics

User Story: I know the part number and I want to find it on the system so I can find out its price and availability.
Analytics: How many searches of this type? In what sequence? Which led to an order?

User Story: I don’t know the part number and I want to try to identify it online so I can find out its price and availability.
Analytics: (see above)

User Story: I don’t know the part number and I can’t determine it and I want help so I can find out its price and availability.
Analytics: (see above)

User Story: I want to see the pricing and availability of the part so I can decide on next steps and get agreement from the customer.
Analytics: How many orders of which parts? What’s the relationship to all parts ordered (per tech)?
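
One plausible way to make the three search stories distinguishable in those analytics is a single search event that carries a type parameter, sketched below against GA4’s gtag.js interface. The event and parameter names are illustrative assumptions, not the team’s actual tags.

```typescript
// Sketch: one search event, three user stories, distinguished by a
// parameter. Assumes GA4 via gtag.js is already loaded on the page;
// the event and parameter names are illustrative assumptions.
declare function gtag(command: "event", eventName: string, params?: object): void;

type PartSearchType =
  | "by_part_number"   // "I know the part number..."
  | "identify_online"  // "I don't know the part number... identify it online"
  | "assisted";        // "...I can't determine it and I want help"

function trackPartSearch(searchType: PartSearchType, query: string): void {
  gtag("event", "part_search", {
    search_type: searchType,
    search_term: query,
  });
}

// Usage (hypothetical part number): trackPartSearch("by_part_number", "TXV-3410");
```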

Exhibit C- Implementing Quantitative Observation with Google Analytics

  • review of what they’ve done- how you can observe an individual user journey on Google Analytics (possible delivered via video)
    • validation/test orientation
  • review of how they did it (WP templates, Goals, Tag Manager…etc/whatever)
  • notes on getting started with Google Analytics
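
On the validation/test orientation: one common way to verify that tags fire as expected is GA4’s DebugView, which picks up events sent with the debug_mode parameter. A minimal sketch, reusing the assumed event names from the sketches above:

```typescript
// Sketch: routing test traffic to GA4's DebugView via the debug_mode
// parameter so each tag can be verified before the numbers are trusted.
declare function gtag(command: "event", eventName: string, params?: object): void;

const DEBUG = true; // turn off in production

function track(eventName: string, params: Record<string, unknown> = {}): void {
  gtag("event", eventName, DEBUG ? { ...params, debug_mode: true } : params);
}

// Walk the onboarding path by hand and confirm each event shows up in
// DebugView in the expected order:
track("sign_up");
track("part_search", { search_type: "by_part_number" });
track("part_ordered_online");
```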

Exhibit D- Implementing Qualitative Observation with Mouseflow

  • review of what they’ve done: how you can observe an individual user session in Mouseflow (possibly delivered via video)
    • validation/test orientation
  • review of how they did it
  • notes on getting started with Mouseflow
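
For orientation on the Mouseflow side: Mouseflow exposes a _mfq command queue on instrumented pages, and tagging sessions or setting custom variables makes it much easier to pull up the recordings for a specific user story. A minimal sketch; the tag and variable names are illustrative assumptions:

```typescript
// Sketch: tagging Mouseflow session recordings so they can be filtered
// by user story. _mfq is Mouseflow's standard command queue; the tag
// and variable names are illustrative assumptions.
declare global {
  interface Window { _mfq: unknown[][]; }
}

window._mfq = window._mfq || [];

// Tag the current session recording, e.g. when an assisted search starts:
window._mfq.push(["tag", "part-search-assisted"]);

// Attach a custom variable, e.g. which prototype variant the tech saw:
window._mfq.push(["setVariable", "variant", "B"]);

export {}; // makes this file a module so the global augmentation applies
```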