Needfinding with Jobs-to-be-Done

Despite the steady ascendance of the importance ascribed to good design, there's still a lot of misinformation out there. This diagram summarizes two common misperceptions about good practice that I come across: (diagram: The Twin Antipoles of Design Failure)

Needfinding, a set of methods originally developed at Stanford over 50 years ago, offers a better alternative. The basic idea is that by focusing on a customer’s fundamental problems/needs (which rarely change) instead of solutions (which change a lot), your company remains anchored in a foundation that’s highly durable yet innovation-friendly. Tactically, it helps make sure that your customer research is actionable.

After reading this tutorial and engaging with the related practice you will be able to:

  1. Take new ideas and frame them in terms of an objectively testable customer need
  2. Operationalize your ideas on value propositions with a before and after storyboard
  3. Conduct customer discovery to refine and test your ideas on customer needs

Leaving Where You Are, Going Where the Customer Is

I like to organize needfinding around three key elements: 1) problem scenarios 2) alternatives and 3) value propositions. If you’re familiar with the ‘jobs to be done’ framework, problem scenarios are very similar. The basic idea is to understand your customer’s fundamental problem/desire/habit/job/need/task, including (#2) their most preferred alternatives. Against these, you can both tune and test your value hypothesis.

The following diagram summarizes these items: (diagram: Personas, Problem Scenarios, Alternatives)

I think you’ll find these problem scenario-alternative-proposition trios are more discussable, actionable, and testable than most alternatives. How many meetings have you been in where there’s argument about which features the firm should implement? Why not at least argue about which problems really exist and are important? At least then you’re much closer to what’s fundamentally relevant to the customer.

If you're selling to businesses (B2B), ask yourself what fundamental jobs you're doing for the customer; these don't change over time. The same holds for consumers: while the way we fulfill them changes, our fundamental needs and desires do not, and those are what you'll be looking at if you're selling a consumer product (B2C).

Following identification and prioritization of your target problem scenarios, you’ll want to look at alternatives- how is the customer/user resolving these problem scenarios today? If the problem scenario really exists, they’re doing something about it now. Discovery around both problem scenarios and alternatives is much less costly than discovery on propositions for the simple reason that you don’t have to have anything built to acquire the answer. You just need to go out and observe your customer personas. Only after you have a working view on the relevance of problem scenarios and an understanding of current alternatives would I recommend evaluating value propositions.

I use an example company called ‘Enable Quiz’ in the materials on this site. They’re testing a solution for companies that hire engineers and want to better screen technical talent so their hiring process works better.

The key personas are Helen the HR Manager, who acquires resumes and helps screen candidates, and Frank the Functional Manager, who determines his hiring needs and makes the final decision on hiring for the team he manages.

Here are a few notes on their problem scenarios, alternatives, and Enable Quiz’s ideas on value propositions for them:

PERSONA: Helen the HR Manager
JOB-TO-BE-DONE: "I need to screen recruits for specific technical skills in order to send Frank/Francine only recruits with the skills they want. A lot of unqualified recruits are ending up with Frank/Francine."
ALTERNATIVE(S): Call references; take the candidate's word for it.
VALUE PROPOSITION: New ability for meaningful screening of technical candidates, increasing % of successful hires and lowering Frank's workload on recruiting.

PERSONA: Frank the Functional Manager
JOB-TO-BE-DONE: "I have limited time and I don't want to be a jerk. It's hard to screen for all the relevant technical skill sets."
ALTERNATIVE(S): A few probing questions; take the candidate's word for it.
VALUE PROPOSITION: Less time doing interviews, and better hires sooner.

Your key question around these problem scenario-alternative-value proposition trios is: How much better is my value proposition than the current alternative(s)? Is it ‘better enough’ that the customer’s going to buy or use my product?

In fact, you can roll all this into a testable formulation I like to call the 'product hypothesis' (though it could also apply to an individual feature) as follows: (diagram: Lean Product Hypothesis)

This leaves you with a highly testable view of your personas and whether what you’re going to deliver will hit the mark. Shortly, we’ll look at some ways to test that assumption, but first I’d like to show you how storyboards can help improve your personas.
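The 'product hypothesis' trio can also be sketched in code. A minimal sketch, where the class and field names are my own illustration (not a standard schema), showing how each element becomes something you can state and test explicitly:

```python
from dataclasses import dataclass

@dataclass
class ProductHypothesis:
    """One problem scenario/alternative/value proposition trio.
    Field names are illustrative, not a standard schema."""
    persona: str
    problem_scenario: str     # the job-to-be-done (rarely changes)
    current_alternative: str  # how the persona resolves it today
    value_proposition: str    # what we believe is 'better enough'

    def statement(self) -> str:
        # Render the hypothesis as a falsifiable sentence for discussion.
        return (f"If {self.persona} faces '{self.problem_scenario}' and "
                f"currently '{self.current_alternative}', then "
                f"'{self.value_proposition}' is better enough to drive adoption.")

hyp = ProductHypothesis(
    persona="Helen the HR Manager",
    problem_scenario="screening recruits for specific technical skills",
    current_alternative="calls references and takes the candidate's word for it",
    value_proposition="meaningful technical screening quizzes",
)
print(hyp.statement())
```

Writing the trio down this explicitly makes it discussable: each field is something you can go test in discovery before building anything.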

Designing Testable Propositions

Once you've identified pairings of problem scenarios and value propositions, I recommend drafting pivotal metrics to focus your discovery and testing. This lets you pair your qualitative hypotheses with quantitative evidence, focusing your work and driving to informed, confident decisions. You may have testing you want to do earlier or later in the customer journey (e.g., acquisition earlier or referral later), but for the core product design, I recommend the following breakdown: onboarding, engagement, and outcomes.

Teams should answer the following questions about each of onboarding, engagement, and outcome as they test their value proposition relative to a target PS/JTBD:

  - What does this mean?
  - What is the interval?
  - How might we test this?
  - What are the metrics?
  - What's tricky? What do we need to watch?

For examples of these, please see Example A and Example B below.

For a template to use in getting started, please see this template you can use in Google Docs (or download for MS Word): Hypothesis-Driven Development Template

Interviewing Subjects to Discover and Test Problem Scenarios

For this, I recommend following the same process used for doing discovery on personas:

(diagram: Persona Process)

In practice, you’ll usually be doing this work in parallel with persona development. For details on the process, see: Personas Tutorial- A Persona Process.

Going Deep: Storyboarding a Before & After Scenario

Storyboarding is a great way to show what you mean with your problem scenarios. Communicating is hard and arguing is worse. And we're probably much less effective communicators than we think. With this format, we push ourselves to communicate better and make sure we've really thought through our problem scenario as a testable hypothesis. (illustration: storyboarding the Three Little Pigs)

This format is actually a pair of storyboards, both describing the same problem scenario. The 'before' board shows the relevant personas using their current alternative to resolve the problem scenario. The 'after' board shows the same problem scenario with your value proposition in place. The board below illustrates the problem scenarios above around hiring engineering talent: (diagram: Before-and-After Storyboard)

Here’s the narrative with a little more detail:

NOTES BOARD

Helen the HR Manager does an initial screening on Chris the Candidate. She can look at experience but doesn’t really have the ability to validate the candidate’s skill set.

Chris the Candidate is then passed along to Frank the Functional Manager.

Customer Development Storyboard-Before (1/3)

Frank the Functional Manager is really busy and just goes ahead and makes the hire.

Customer Development Storyboard-Before (2/3)

But in this case, a stitch in time would have saved nine: the candidate doesn't actually have the required skills to the degree Frank understood/expected/wanted.

Now Frank the Functional Manager has to figure out how to fix a situation where his employee doesn’t have the right skill sets.

Customer Development Storyboard-Before (3/3)

The next board narrates the process once the personas of interest have access to the value propositions the Enable Quiz product offers.

AFTER

NOTES BOARD

Helen the HR Manager now has a simple way to screen out candidates missing the skills Frank the Functional Manager has said are an absolute requirement.

Customer Development Storyboard-After (1/3)

Making good hires is rarely easy but Frank the Functional Manager now at least knows they’ll have a certain baseline skill set.

Customer Development Storyboard-After (2/3)

And life’s a lot better.

Customer Development Storyboard-After (3/3)

Example A- Enable Quiz (Startup)

The following Problem Scenarios are for ‘Helen the HR Manager’:

Problem Scenario/Job-to-be-Done: Helen screens engineering candidates for open positions, sending only qualified candidates to the hiring manager. Many of the skill requirements are outside her background.
Engagement Metrics: [Positions Opened], [Positions Filled], [Candidate Interviews]
Current Alternative: She calls references to get a general sense of their performance on the job.
Value Proposition: We'll offer her a new capability for meaningful screening of technical candidates, increasing % of successful hires and lowering Frank the Functional Manager's workload on recruiting.

Problem Scenario/Job-to-be-Done: Helen writes job descriptions with Frank the Functional (hiring) Manager. In a perfect world, they'd continually improve these based on hiring outcomes and employee job satisfaction.
Engagement Metrics: [New Positions Created], [Revisions per Position Description]
Current Alternative: Right now she just gets lists from the Functional Manager and basically passes them through (to job postings, etc.).
Value Proposition: We'll offer a best-practice menu of possible skills linked to popular job descriptions. They can then pair each of these with quiz content to assess candidates' familiarity with target skills.

Problem Scenario/Job-to-be-Done: She'd like to put a company-wide professional development program in place. A few of her peers at bigger companies are doing it and the employees love it. Vendors have come in to see her with various programs but she isn't sure how to assess what skills the employees and managers would like to develop.
Engagement Metrics: [Company-Sponsored Courses Completed]
Current Alternative: Right now she just works with the functional managers if she can convince them to do something on a case-by-case basis.
Value Proposition: Presenting the quiz application as an entry point for the HR manager to make it easy for the functional manager to assess a starting point for a skills management program might be a good way to deliver on this problem scenario.

This is a view of how the team plans to instrument testing and observation into the first problem scenario dealing with screening engineering candidates:

What does this mean?
  - Onboarding: Creating and using at least one quiz
  - Engagement: Consistently using the tool to screen candidates
  - Outcome: In the shorter term, reducing the workload for interviewing. Longer term, increasing retention as a proxy for employee/firm fit.

What is the interval?
  - Onboarding: 60 days
  - Engagement: 120 days (or >1 hire)
  - Outcome: 365 days (est.)

How might we test this?
  - Onboarding: Usability testing, then concierge onboarding tests with selected cohorts
  - Engagement: Observe cohorts across onboarding programs & product iterations; interviews with concierge subjects
  - Outcome: Observe outcome metrics across cohorts

What are the metrics?
  - Onboarding: 1) Quizzes created 2) Quizzes tested (post-completion/pre-interview) 3) Quizzes completed
  - Engagement: Quizzes completed/candidates interviewed
  - Outcome: 1) Candidate interviews/hire (how many interviews across the funnel?) 2) Candidates interviewed/hire (how many candidates through the funnel?) 3) Employee retention vs. baseline

What's tricky? What do we need to watch?
  - Onboarding: Onboarding is tricky and involves multiple parties (see previous)
  - Engagement & Outcome: The observation intervals are long. As we learn more about the process, we should look at ways to decompose these observations and conclude on them sooner, if possible.
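As a rough sketch of how the engagement metric above (quizzes completed per candidate interviewed) might be computed from product analytics, assuming a hypothetical event-log format that is not Enable Quiz's actual schema:

```python
from collections import Counter

# Hypothetical event log: (cohort, event_type) tuples pulled from analytics.
events = [
    ("2024-Q1", "quiz_completed"), ("2024-Q1", "candidate_interviewed"),
    ("2024-Q1", "quiz_completed"), ("2024-Q1", "candidate_interviewed"),
    ("2024-Q1", "candidate_interviewed"),
    ("2024-Q2", "quiz_completed"), ("2024-Q2", "candidate_interviewed"),
]

def engagement_ratio(events, cohort):
    """Quizzes completed per candidate interviewed for one cohort."""
    counts = Counter(etype for c, etype in events if c == cohort)
    interviews = counts["candidate_interviewed"]
    return counts["quiz_completed"] / interviews if interviews else 0.0

for cohort in ("2024-Q1", "2024-Q2"):
    print(cohort, round(engagement_ratio(events, cohort), 2))
```

Comparing this ratio across cohorts (e.g., different onboarding programs) is what turns the engagement question into an observable, per the test plan above.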

Example B- HVAC in a Hurry (Enterprise)

Getting replacement parts to a job site emerged as a favorite problem scenario for the team to focus on for a possible software solution. It was substantial, but not too large. Being able to order a part online (vs. through the central office) looks pretty tractable with software, at least at first glance. The table below describes a few problem scenarios/JTBD for Trent the technician.

Problem Scenario/Job-to-be-Done: Finding up-to-date reference documentation in a usable format.
Engagement Metrics: [Number of Repairs], [Number of Times a Manual Is Referenced]
Current Alternative: Carry printed manuals. (They think; later they find out the techs are actually Googling vendor documentation, and that works fine.)
Value Proposition: We'll offer a library of vendor manuals that are indexed and searchable in a consistent way tailored to the technician's needs. (They were able to eliminate this VP with customer discovery; the alternative actually works quite well.)

Problem Scenario/Job-to-be-Done: Getting replacement parts to a job site.
Engagement Metrics: [Replacement Parts Ordered]
Current Alternative: Call the office and request the part, then wait for an update on the phone or through a call-back.
Value Proposition: We'll create a self-service parts ordering process with greater transparency on cost and turnaround time, lowering the overhead of fulfilling a parts order.

Problem Scenario/Job-to-be-Done: Getting all the necessary information to arrive at a job fully prepared.
Engagement Metrics: [Number of Repairs], [Number of Calls to Dispatch Prior to Job], [Number of Return Visits/Job]
Current Alternative: What the customer tells dispatch isn't always conveyed to or consumed by the technician, and the customer ends up repeating themselves or the technician ends up with less information than they need. Trent often calls dispatch on his way in and tries to get a quick briefing; he reads the notes on the job if they're available.
Value Proposition: We'll create a more structured, automated request and dispatch process with better routing and storage of notes from the customer.

Now that the team’s found a problem/job worth solving, they turn their attention to creating a valuable solution, one that’s better enough than the alternative to drive action and materially better outcomes for the user.

This is a view of how the team plans to instrument testing and observation into the first problem scenario dealing with getting replacement parts to a job site:

What does this mean?
  - Onboarding: Signing up and ordering one part
  - Engagement: Consistently using the tool to look up and order parts
  - Outcome: A combination of reducing the overhead to complete a job and increasing customer satisfaction

What is the interval?
  - Onboarding: 1 day
  - Engagement: 30 days
  - Outcome: 90 days (est.)

How might we test this?
  - Onboarding: Usability testing, then shadowing reps with various options or mandates to use the tool
  - Engagement: Observe cohorts across onboarding programs & product iterations
  - Outcome: Observe outcome metrics across cohorts

What are the metrics?
  - Onboarding: Number of signups & signups with >0 orders
  - Engagement: Parts ordered online/parts ordered by tech >80%
  - Outcome: 1) Reduced job time 2) Reduced turnaround time 3) Increased customer satisfaction/job

What's tricky? What do we need to watch?
  - Onboarding: Pulling vs. pushing the use of the tool
  - Engagement & Outcome: Engagement vs. outcomes with pulling vs. pushing (see previous)
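A minimal sketch of the engagement check above (the >80% online-ordering target); the cohort names and counts are invented for illustration:

```python
def online_share(parts_online: int, parts_total: int) -> float:
    """Share of a technician cohort's parts orders placed through the tool."""
    return parts_online / parts_total if parts_total else 0.0

def meets_target(parts_online: int, parts_total: int, target: float = 0.80) -> bool:
    # Engagement goal from the test plan: >80% of orders placed online.
    return online_share(parts_online, parts_total) > target

# Hypothetical per-cohort counts: (online orders, total orders by tech).
cohorts = {"mandated": (42, 50), "pulled": (33, 48)}
for name, (online, total) in cohorts.items():
    print(name, f"{online_share(online, total):.0%}", meets_target(online, total))
```

Splitting the check by cohort is what lets the team compare pulling vs. pushing the tool, the tricky item flagged in the plan.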

As the team looks at the specific user story it's currently implementing against the above PS/JTBD, they have the following ideas on analyzing the individual stories once they're live:

USER STORY: I know the part number and I want to find it on the system so I can find out its price and availability.
ANALYTICS: How many searches of this type? In what sequence? Which led to an order?

USER STORY: I don't know the part number and I want to try to identify it online so I can find out its price and availability.
ANALYTICS: (see above)

USER STORY: I don't know the part number and I can't determine it and I want help so I can find out its price and availability.
ANALYTICS: (see above)

USER STORY: I want to see the pricing and availability of the part so I can decide on next steps and get agreement from the customer.
ANALYTICS: How many orders of which parts? Relationship to all parts ordered (per tech)?
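The story-level analytics above can be sketched as a simple funnel computation over session events; the event names and session data are hypothetical:

```python
from collections import Counter

# Hypothetical per-session event sequences from the parts-ordering tool.
sessions = [
    ["search_by_part_number", "view_price", "order_placed"],
    ["search_by_description", "view_price"],
    ["search_by_part_number", "order_placed"],
    ["request_help_identifying", "view_price", "order_placed"],
]

def search_counts(sessions):
    """How many sessions used each search/help entry point?"""
    return Counter(e for s in sessions for e in s
                   if e.startswith(("search_", "request_")))

def conversion(sessions, search_event):
    """Of sessions containing this search event, how many led to an order?"""
    matching = [s for s in sessions if search_event in s]
    ordered = [s for s in matching if "order_placed" in s]
    return len(ordered), len(matching)

print(search_counts(sessions))
print(conversion(sessions, "search_by_part_number"))  # (2, 2)
```

The same helpers answer both questions in the table: counts by search type, and which searches actually led to an order.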