Master the art of prompt engineering to get the most out of Large Language Models like ChatGPT. Learn proven techniques to navigate memory limitations, employ variables for flexibility, and eliminate chatty outputs for efficient and accurate results. If you're serious about implementing Jobs-to-be-Done, you'll find serious tools developed by a highly trained practitioner.
00:02 Alright, this one's going to be a lengthy one, but I wanted to take some time to help those of you who don't have a fundamental grasp of the jobs-to-be-done approach
00:17 that I use, which is based on outcome-driven innovation and not switch interviews. I just wanted to give you some perspective on what this masterclass covers, and which portion of that type of research it covers.
00:32 Today's session covers a lot, and it automates a lot, but the approach is very different from what you may be used to. So I wanted to paint that picture for you and talk through all the different components
00:49 that you're going to find in this masterclass and in the catalogs that we generate. So let me see if I can get my slide to move forward. There we go.
01:04 So this approach to jobs to be done has basically six parts to it. We can break it down into three areas, qualitative, quantitative, and strategy, but I want to break it down a little more than that, because each of these areas really requires a different competency.
01:23 And to some degree, these different areas may all be automated, just like I'm doing with this first phase, probably in the not-too-distant future.
01:36 Just saying. So the first piece, what we're addressing here, is really building a value model. I know a lot of qualitative research is designed to produce insights as the end output, but I think we can all agree that the durability of those types of research projects
01:59 is pretty low. They don't last long; with a different group of people, you get a different set of insights. So what this is designed to do is build an extremely robust and stable model.
02:12 And it does that by abstracting itself away from any solutions or journeys. Well, we can do journey work, but I mean the solution portion of any of that, because
02:25 solutions are what change over time, while jobs to be done are stable over time. And that's what we want as the foundation for our research.
02:34 So that's what we're focused on in this masterclass: building that model. It's a lot of work, and it's really the proprietary part of this approach.
02:44 I mean, everything else, you know, data science is data science, but the value model and the way you build it up is fairly proprietary.
02:54 So that's why automating it is pretty important. The next part: once we've done that, the model becomes the backbone of a survey.
03:05 So now that we have this logical model that's solution-agnostic, along with all the metrics that end users or customers might use to measure success along the way, we can put that into a survey. We'll get into a little of the detail, but not a ton.
03:22 That survey is designed to prioritize the model in the market so that we can generate insights. The next step is to take the outputs from that survey and organize them into a data model that we can then analyze.
03:40 And once we've gone through our analysis, which really should align with a research imperative or research plan designed even before building the value model, we can take the answers we come up with, the insights we surface, and the value gaps
04:00 we find in the market, and formulate a strategy or a set of potential strategies. And only then can we really craft a story.
04:11 So, you know, the executives in your organization, or your clients and stakeholders, are going to be more receptive to a compelling story than to a bunch of dots on a slide.
04:28 They want to see real people, real emotions, real scenarios, and that's really what your job is there.
04:37 So you can't do that until you have all of these other components. But let's take a look at the value model first.
04:50 I've overrepresented the complexity here. I'm trying not to use other people's work product to describe this, but essentially it's a bunch of pieces that we put together.
05:01 And at the center is what we're really trying to study: either a job to be done in a job map, or a set of jobs, or something else at the core.
05:12 And then there are all of these other elements orbiting it. Anybody familiar with data science may have heard of the star schema,
05:20 where you've got a fact table in the middle and dimensional data all around it that allows us to look at the core data through a different set of lenses.
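The star-schema idea can be sketched in plain Python: a central fact table of ratings, plus dimension tables that let you view the same facts through different lenses. All of the table names, fields, and numbers below are hypothetical illustrations, not the course's actual data model.

```python
# Minimal star-schema sketch: a central fact table of outcome ratings,
# with dimension tables that describe each fact from a different angle.
# All names and values here are made up for illustration.

# Dimension tables: keyed lookups describing respondents and outcomes.
respondents = {
    1: {"segment": "underserved", "context": "commercial painting"},
    2: {"segment": "overserved", "context": "DIY home painting"},
}
outcomes = {
    10: {"job_step": "prepare surface", "statement": "minimize time to mask trim"},
    11: {"job_step": "apply paint", "statement": "minimize visible brush marks"},
}

# Fact table: one row per (respondent, outcome) rating.
facts = [
    {"respondent_id": 1, "outcome_id": 10, "importance": 9, "satisfaction": 3},
    {"respondent_id": 1, "outcome_id": 11, "importance": 8, "satisfaction": 7},
    {"respondent_id": 2, "outcome_id": 10, "importance": 4, "satisfaction": 8},
]

def ratings_by(dimension_key):
    """Group fact rows by a respondent dimension, e.g. 'segment' or 'context'."""
    grouped = {}
    for row in facts:
        lens = respondents[row["respondent_id"]][dimension_key]
        grouped.setdefault(lens, []).append(row)
    return grouped

by_segment = ratings_by("segment")
print({k: len(v) for k, v in by_segment.items()})  # → {'underserved': 2, 'overserved': 1}
```

The same fact rows can be regrouped by `context` (or any other dimension) without touching the core data, which is the point of the star-schema analogy.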
05:32 So that's the point here. A value model really is composed of a job to be done and the end user, the group of people who are trying to get the job done.
05:43 We cover that, and I give you some prompts to help you do it. The next thing that I do, and it's optional, is generate contexts.
05:55 I think we can all agree that context is very important when we're studying things. At a universal level the job may give us some good knowledge, but the context is really where we can take action.
06:10 And it may be really important to a particular organization to focus on an area like that. So what I do is generate 20 contexts, and then we can go through and make a decision, either as independent research or working with a stakeholder, about what we want to do here.
06:28 Do we want to analyze the job at that universal level and just capture the context, to see how respondents in different contexts rate these metrics, or do we want to add the context into the job statement itself?
06:47 And what you'll see in my prompts is that you can write the job statement, keep it the same, and then just write a contextual clarifier in a different input, and it will generate a completely different map.
06:59 So if you have 20 contexts in your database and you run 20 maps, you're going to get 20 different maps, because context is critical.
07:10 So once we've made that decision, we can generate the job map, and once we've generated the map, we can generate the success metrics for each step in the map.
07:23 And that is the key piece of this method and this accelerator tool: getting those metrics generated, because they are very specific.
07:36 The format that I learned is very specific, and that's the format I'm using in this course. There will be some other formats
07:45 added to it, because as I mentioned in another video, this format can be extremely difficult for some people to consume.
07:57 So once I've done the map and the metrics, I'll go on and generate what I call situations, a situation table. We've called them complexity factors in the past, but essentially they are external
08:17 conditions, I won't say constraints, but conditions under which you're executing the job. For example, if I'm painting a house and it's very humid out, that is a situation or complexity factor that will impact,
08:33 for instance, how long it takes one coat to dry. So it's good to know that type of information.
08:41 So I usually generate about 20 of them, and then we put them in the survey, where they help us explain why different segments of users struggle differently.
08:53 That's one of the ways. Another is context, and another is emotional and social jobs. So we take a look at how people want to feel, or want to avoid
09:04 feeling, or how they want to be perceived or avoid being perceived, when they're getting the job done. They're not helpful independently.
09:14 A lot of people will tell you, when we call them emotional or social jobs, that there are different types of jobs.
09:19 There's one type of job: the functional job. The other types of jobs have to be analyzed in the context of that functional job to be of any use at all.
09:31 I know there's a difference of opinion, especially with aspirations and things like that, but this is the problem space. We're on the front end of innovation here;
09:39 we're not in the design space. I know that's another term that gets bandied about and has different interpretations,
09:46 so you're just going to have to accept my interpretation for this, because that's the way I look at it. Related jobs can be a little confusing.
09:57 When we're looking at what we call core functional jobs, which are really above the journey level, they are things that we want to accomplish, like,
10:08 you know, getting somebody to Mars. There are steps just in getting people to Mars that you need to study.
10:20 But there are also things you might have to do before you get them to Mars, like building a viable delivery platform, or evaluating fuel options, things like that.
10:35 So those are the jobs that are related to this core job, but they may be done before
10:44 that job can get done, or they're done afterwards, or even during. When we're talking about consumption jobs, consumption jobs equal journeys. Think of journeys, or the consumption chain in general, which could be both the customer side and the provider side.
11:03 So you've got two sides that have to come together. But anyway, when we're looking at consumption jobs, the related jobs tend to look like steps in the consumption chain.
11:14 I'll just say that I'm not going to get into the detail of that; it's something that probably requires a little more thought. But that's what you're going to see, so don't get confused by it.
11:28 We've got financial metrics, and financial metrics are essentially related to the cost of a solution, and the cost isn't just the direct cash outlay when you purchase it.
11:42 It's not just about the purchase price. It could be other things you may need to purchase during the life cycle of ownership: maintenance, services, whatever.
11:51 It could also be related to the performance of the product. So if the product doesn't perform at the level you need it to, there's opportunity cost, or the cost associated with making up for that lack of performance.
12:10 So when we get to fielding a survey, and I'll probably power through some of these a little faster,
12:17 I'll just say this: there are a lot of different types of research that we want to do, and this model facilitates most of it.
12:27 It really does. You can plug just about any problem into it and get some really good results. But there is really no single way to do this.
12:37 For those of you familiar with job mapping and outcomes, that type of survey is focused on a single job and getting really deep on that job.
12:48 We improve solutions that help people get this job done, and that's pretty much what everybody talks about, especially in the outcome-driven-innovation world. But there are also, at a higher level, platform considerations, where a platform gets many jobs done.
13:10 So what we might want to do is understand which of these 30 jobs are the highest priority.
13:18 And there are different ways to measure that. It's got to be more than just importance and satisfaction; there are more attributes that you need to measure.
13:27 That's a topic for another day, but just know that we can elevate this to the job level and rate jobs themselves.
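As a baseline for the importance-and-satisfaction piece mentioned above, the widely published ODI-style opportunity score can be computed as below. This is a sketch of the standard published formula, not the course's full scoring model; as he says, job-level prioritization needs additional attributes that aren't shown here, and the jobs and scores are invented for illustration.

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """Classic ODI-style opportunity formula: opp = imp + max(imp - sat, 0).
    Inputs are typically on a 0-10 scale (often the share of respondents
    rating 4-5 on a 5-point scale, scaled to 10). Underserved needs score
    high; overserved needs collapse to plain importance."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical jobs with (importance, satisfaction) ratings:
jobs = {
    "plan the project": (8.2, 7.9),
    "track progress": (9.1, 4.3),
    "report results": (6.5, 6.8),
}

ranked = sorted(jobs, key=lambda j: opportunity_score(*jobs[j]), reverse=True)
print(ranked)  # → ['track progress', 'plan the project', 'report results']
```

The same formula applies whether you rate outcomes within a single job map or, as described here, elevate it to rate whole jobs against each other.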
13:37 There's also an in-between space, where maybe you don't need to go as deep as a job map and outcomes; you just need a quick gauge because you've got a leader who wants an answer now.
13:51 It's probably easier just to generate some inputs into a survey that characterize the ideal in getting this job done.
14:03 So you might generate 15, or 10, or 20, of what I consider to be high-end capability levels: things that people would consider the ideal state or states when they're trying to get that job done. That's kind of an in-between thing.
14:21 So you're going to see that in here as well, and that's what those are for. They're not part of a traditional jobs-to-be-done research study;
14:32 it's more that they demonstrate the flexibility here. We don't have to do it any one certain way. Once that data comes out, you know, anybody that's done a survey has seen the flat-file output.
14:53 That's not super useful. If you've ever seen a real, complex jobs-to-be-done study, with a job map and a lot of dimensional data being captured, the output is extremely complex, even though it's a flat file.
15:13 And it takes a lot of work to put it into shape. Some people have tools to do that and some people don't;
15:21 they have to struggle through it. So that is definitely a place where more work needs to be done, and it is being done, to speed it up.
15:29 But at the end of the day, I mentioned the star schema. What you're trying to do is get the data into a format, and that's just one possible format, that allows you to slice and dice it,
15:44 prioritize it, and segment it in a simpler fashion. So that's what we're trying to do when we get to that point.
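The reshaping step described above, going from one wide flat-file row per respondent to one long fact row per rating, can be sketched as below. The column-naming convention (`out10_imp` meaning "importance rating for outcome 10") is invented for illustration; real survey exports name columns differently.

```python
import csv
import io

# A hypothetical wide survey export: one row per respondent, one column
# per outcome/metric pairing (e.g. "out10_imp" = importance for outcome 10).
flat_csv = """respondent_id,segment,out10_imp,out10_sat,out11_imp,out11_sat
1,underserved,9,3,8,7
2,overserved,4,8,5,9
"""

def melt(flat_text):
    """Unpivot wide rating columns into long fact rows, one row per
    (respondent, outcome), keeping importance and satisfaction together."""
    rows = []
    for rec in csv.DictReader(io.StringIO(flat_text)):
        # Derive outcome ids from the "outN_*" column names.
        outcome_ids = sorted({k.split("_")[0] for k in rec if k.startswith("out")})
        for oid in outcome_ids:
            rows.append({
                "respondent_id": rec["respondent_id"],
                "segment": rec["segment"],
                "outcome": oid,
                "importance": int(rec[f"{oid}_imp"]),
                "satisfaction": int(rec[f"{oid}_sat"]),
            })
    return rows

long_rows = melt(flat_csv)
print(len(long_rows))  # 2 respondents x 2 outcomes = 4 fact rows
```

In the long shape, each row is a fact-table entry, so slicing by segment, context, or any other dimension becomes a simple group-by instead of column gymnastics.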
15:57 Once we're there, we want to analyze the data. So our goal here is really to know something. We want to know with certainty where the value gaps are in the market, or in a journey or an experience.
16:12 Other methods are really kind of guesses; that's why their shelf life is so short. You know, when I was at T-Mobile, I saw people pitching different variations of journey-mapping research, including from Bain, and it was hilarious, because the people who
16:31 were purchasing it, or potentially purchasing it, were getting sick of it. Over the last three or four years, I think they had done something like six of these studies, and it's like, why are we constantly redoing these?
16:42 This approach kind of solves that problem because, as I said, these models are very stable, so they give a good depiction of the market or of a journey.
16:55 And because of all the metrics, we're really capturing a lot of data when we field a survey, and we're able to slice and dice it
17:04 in different ways. And this data is very forward-looking. I'll just say, from my experience, I've seen market research done the old way, I guess, where we're constantly dealing with hypotheticals and things like that.
17:26 And I've seen it done this way. And this way always wins. This way would have told someone like my former employer what they shouldn't have invested in
17:36 five years ago. They were constantly making bad investments because it was just some leader's idea, and this approach will validate or invalidate that, or it will give you the medium through which you can ask questions in a new way to help validate
17:52 or invalidate it, maybe even as a set of small experiments. So you can do more than just study the main job; you can throw little experiments in as well.
18:00 But the concept here is that this data is hard to beat. It really is. And I know most of you probably haven't gone this far with it,
18:09 but this is where we're trying to go when we build these value models in the first step. And once we have all of that data and it's analyzed, it's time to try to formulate strategies.
18:28 You know, we can have a different strategy for different segments. So if we're doing segmentation based on needs, we may have an overserved segment
18:40 and a couple of different underserved segments, all of which are going to have different strategies tied to them. And then, of course, you've got to consider the fact that you might have strategies targeted at products, or portfolios, or services, or marketing methods.
18:57 So you should be considering all of those things, and this type of data allows you to do that, and also to do it in comparison to competitors.
19:08 The problem with doing competitive analysis is that everybody promotes different features and functions. What this does is baseline everything, so everybody's being compared against the same set of performance indicators, and that makes it very simple
19:24 to do those types of comparisons. And then ultimately we've got to tell a story. So again, if we can eliminate some of this qualitative interview stuff, I know that, for me, I can learn how to tell a more compelling story.
19:45 I'll have time to do it. Stories are important because executives, a lot of whom have been in sales, like to emotionally connect with people.
19:56 So give them a well-defined, compelling story that's got some emotion in it. And that may require that you go back to segments after a survey and do some one-on-one interviews to help pull those verbatims out, because even if you do human interviews
20:15 on the front end, you're not going to get usable verbatims; those people didn't take the survey, and you don't know what segment they were in.
20:23 So anyway, there are a lot of pieces that go into this, and storytelling is just another critical one. You need to sell this to your stakeholder;
20:30 that's what they're paying you for. That's the final piece of the puzzle. And then I put up this slide: big jobs to be done versus little jobs to be done.
20:42 There are multiple schools of thought with regard to this, but what we're talking about here is the big jobs to be done.
20:50 Okay? It's been around a long time. It's been tested over thousands of projects. Some of these other approaches have not been.
21:00 And, you know, I would say that anything that's focused simply on the qualitative interviewing process, where when you get done you ask, now what?
21:12 That's a little job to be done, because it's not getting you there, and you're going to end up having to redo it.
21:17 You're going to have to do those interviews again. And I get it: if that's what you like to do, you're probably happy with that, but that's not what this is about.
21:24 What this is about is getting past that, so that we can do a more rigorous type of innovation study and we're not constantly failing fast, because that's just not productive.
21:37 We don't need to fail fast. We can get it right far more frequently if we just generate the correct kind of data and prioritize it properly in the market.
21:47 And that is what this approach using AI does. It helps us accelerate our way through all the mundane things that really soak up time on your calendar.
22:00 They may not take a long time, but they're just tough to schedule, you know, interviews and things like that.
22:06 Get through that, and get to the field with a survey in hours instead of weeks, or sometimes even months.
22:14 So that's what we're trying to do here. And as you'll see over time, we're going to tackle the rest of these phases.
22:20 So stay tuned for that. And if you have any questions, feel free to leave comments in the course itself; I've turned comments on.
22:30 And at some point we'll have a community on here, hopefully in the next few months, and we'll be able to help each other move forward.
22:40 So, have fun!
© 2023 Practical JTBD, LLC. All rights reserved. This work is provided for the exclusive use of authorized individuals in industry innovation and strategy roles and may not be reproduced, distributed, or commercially exploited in any manner whatsoever. Unauthorized use is strictly prohibited and subject to legal action.