Episode 13: The Riskiest Assumption Test
Have you ever spent a bunch of time and money building an MVP and it still falls flat with customers? What if you could test out those assumptions before you even build an MVP? You can – it’s called the Riskiest Assumption Test.
For short, we call it a RAT (which Jim loves to say) and it’s a usually quick, usually inexpensive way to test if the things you think are true about your product are actually true.
Like they say – there are no truths inside the building – only opinions. The RAT allows you to get out there and see what’s actually true for your customers before you invest serious resources. Jim gives some great examples of Riskiest Assumption Tests that will get you thinking.
Lizzie Williams: Hey, everybody. On this episode of 10,000 Feet, we talk with frequent contributor and CIO Jim VanderMey about a topic we get asked about a lot: how to validate your idea before spending tons of money. Instead of jumping right into building a minimum viable product, known as an MVP, Jim introduces the idea of a Riskiest Assumption Test, or RAT. Jim gives some great examples of classic assumption tests on this episode that may give you some useful ideas. Enjoy!
Jim VanderMey: So I think that one of the problems with the MVP is the term “minimal.” People tend to think of it as stripped-down, as our lowest cost of entry. But if you look at the Lean Startup literature, I was reminded of something Eric Ries wrote in The Lean Startup: an MVP is “that version of a new product which allows a team to collect the maximum amount of validated learning about customers with the least effort.” One of the challenges we have with our established enterprise brands is that in the era of five-star ratings, minimal still has to be really good.
Andrew Powell: Which isn’t in fact minimal at all.
Jim VanderMey: Right. And so when you have these pressures as a market leader, and we’re saying that you have to do an MVP, but you’re also recognizing that the success of your product is going to be based on its Amazon ratings, then minimal takes on a new meaning. As opposed to a pure-play digital product, where you’re developing a mobile app or a web service, minimal can be something different there, because the rebranding, relaunch, and iteration is much simpler than when you’re manufacturing something in China and have a significant capital investment in tooling and product.
Andrew Powell: So, Jim, what you’re describing doesn’t sound like an MVP at all. It sounds much more like a version 1.0, a fully polished, released product.
Jim VanderMey: Yes.
Andrew Powell: Is there an MVP that should have come before that, instead of this cart-before-the-horse situation?
Jim VanderMey: And that’s why we talk about the RAT, the Riskiest Assumption Test.
Andrew Powell: Yeah, say more.
Jim VanderMey: So the RAT, the Riskiest Assumption Test, is about identifying the hypothesis that, if proven false, would put the product at risk. For example, we have one client who was wondering if people would actually want to get the consumables for their product online: would they order through the mobile app? They have a product that every once in a while has to have fulfillment of a consumable, a filter in this case. And they were wondering, would people actually order the filter when the product tells them that they’re at a certain point in the life cycle of the filter?
Andrew Powell: So they’re testing whether the product saying, “give me a new filter” is motivation enough for the customer to buy a new filter.
Jim VanderMey: Right. But in order to test that, historically you’d say, well, we have to completely integrate that with our backend e-commerce platform—
Andrew Powell: Right. To get the order.
Jim VanderMey: To get the order through. So what they did is they put the button in the app and sent an email to an intern. The intern would actually go out to the warehouse, get the part, and ship it to the customer. So with a low-cost experiment, they were validating that this particular activity would be something consumers would value. And then they were also able to learn at what stage in the notification process, and with what notifications, and is it—
Andrew Powell: Right, right
Jim VanderMey: Do people actually buy it after the filter has expired, when it’s fully consumed, or do they buy it in advance when a notification pops up? But instead of doing the full integration on the backend, they were able to run the test and validate that hypothesis at a low cost.
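The experiment Jim describes, a button in the app backed by a human instead of an e-commerce integration, can be sketched roughly like this. All names, addresses, and fields below are hypothetical illustrations, not details from the client’s actual system:

```python
# Sketch of a "concierge" RAT: the "Order filter" button never touches
# an e-commerce backend. It just turns a tap into a task for a human
# (here, an email draft), while recording the one data point the
# experiment cares about: when in the filter's life the customer ordered.
from dataclasses import dataclass
from email.message import EmailMessage

FULFILLMENT_INBOX = "intern@example.com"  # hypothetical address

@dataclass
class FilterOrder:
    customer_id: str
    filter_model: str
    filter_life_pct: int  # % of filter life remaining at tap time

def build_fulfillment_email(order: FilterOrder) -> EmailMessage:
    """Draft the email a person fulfills by hand from the warehouse."""
    msg = EmailMessage()
    msg["To"] = FULFILLMENT_INBOX
    msg["Subject"] = f"Ship filter {order.filter_model} to {order.customer_id}"
    msg.set_content(
        f"Customer {order.customer_id} ordered {order.filter_model} "
        f"with {order.filter_life_pct}% filter life remaining."
    )
    return msg

order = FilterOrder("cust-042", "HEPA-11", filter_life_pct=10)
print(build_fulfillment_email(order)["Subject"])
```

The point of the sketch is the shape of the test: the customer-facing experience is real, the fulfillment behind it is manual, and the hypothesis gets validated before any backend integration is built.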
Andrew Powell: So then when they make their investment, they make it fully informed in how their users are using this function, or are likely to use it.
Jim VanderMey: Yes. And that is the value of the RAT process: there’s an economic opportunity, or a process opportunity, something that you believe to be true. And sometimes those things are proven false really quickly through a design exercise. We have done RAT validation in one afternoon in some cases, because someone said, “yeah, people want to engage with our product in this way,” and we realized, no, they actually don’t want to engage with your product that way. You can’t create a compelling experience around this particular activity, because people who are very close to products are oftentimes too close.
Andrew Powell: Right, right. Yeah. They see what they want to see in their product. So, Jim, we jumped in right in the middle. I want to try and go back to before the product exists. Is there a RAT situation where we’re trying to evaluate whether or not we should even build something?
Jim VanderMey: Yeah. Is there enough value in this effort? Is the problem big enough to solve, right?
Andrew Powell: Right.
Jim VanderMey: So is this a problem that warrants taking on the complexity? We run into these problems a lot in the IoT space specifically, because we’re asking a consumer to take on digital complexity, potential security risk, and software life cycle management for the sake of some perceived benefit of the connected product.
Andrew Powell: Right, right. And sometimes those three things you just listed are really expensive things that change the course of the business. And you’re also putting an onus on the consumer. So to move beyond the hobbyist market, will a general consumer get value from this?
Jim VanderMey: And, you know, you look at smart cleaning products. You look at home security products. You think about IoT in certain industrial applications. And some of the very first questions that come up are: is there enough value here to warrant the additional investment?
Andrew Powell: So how do we answer that question without actually building it?
Jim VanderMey: So we can create tabletop exercises. We can create design documents. We can create mock-ups, and then ask people to interact with those and do guided research. And so a subtle shift between the MVP model and the RAT model is that with a RAT, you learn before you build. With an MVP, you build to learn. So the cycle with an MVP is we build, then we learn; with a RAT, we learn and then we build.
Andrew Powell: So does one lead naturally to the other? Do we RAT and then MVP?
Jim VanderMey: Yes. Complementary, absolutely complementary. But again, recognizing that the bar is very high: where is the product or the experience at risk, and what can you easily test to do that validation early in the process? Then that can lead into building the MVP, recognizing that for these physical product companies, these legacy manufacturers, an MVP is actually a version one of the product that has to be commercially successful, and have enough success in the market that your channel and your brand promise hold up and the physical product itself is desirable.
Andrew Powell: So Jim, sometimes in the IoT space, do you run into this? The reason we need to build it is because our competitors are building it.
Jim VanderMey: I just had that conversation this morning with a customer.
Andrew Powell: So how do you validate that?
Jim VanderMey: So they’re saying that our competition is doing something, and they’re getting ahead of us. It creates a sense of urgency, because the company that might be the market leader may be at risk of digital disruption. You’ve got a digital innovator coming in and saying, “You know what, we’re going to create X, this new product that does this really cool thing, and that’s going to disrupt the market.”
Andrew Powell: Right.
Jim VanderMey: And so the incumbent then is faced with a really great question about what they should build to compete against that perceived threat. And the problem that most incumbents have is that they have huge resources, but the innovative product is likely going to be very small at the beginning. So how do you validate that this new experience or this new product or this new service is going to have value in the market? We did a project for Caterpillar where we went out and prototyped in conjunction with some of their key channel partners and key customers. We were building mock-ups and saying, if we did it this way, would you find value in it? We were able to rapidly iterate through a very stripped-down asset, but to identify: is this experience valuable to you? And we were able to determine that some things were, and some things absolutely were not.
Andrew Powell: So through that iteration, you end up with a sort of mock-up where you feel like the customers are saying, “Yes, if you gave me this, I would like it.”
Jim VanderMey: Yes.
Andrew Powell: That then naturally leads to an MVP process. Does that give us a higher likelihood that the customer will actually like it when we build it? Because sometimes what I think I want isn’t what I want.
Jim VanderMey: Right. And that’s avoiding the perceptual bias of the builder, which might be the riskiest assumption of all: the builder’s bias that this is something that is valuable to the market. How do I then validate that and test this hypothesis? How do I construct an experiment to confirm it? A true experiment allows for the possibility that the answer is no. Because we’re not trying to simply validate our hypothesis. We’re trying to test our hypothesis. That’s a subtle nuance, but it’s important.
Andrew Powell: Absolutely. I can’t help but think as consultants—
Jim VanderMey: There’s an aspect of concern there. So what we’re really saying is, give us some money and we’ll either confirm your idea or tell you straight that this thing you want to do isn’t going to work. Right. Part of what we’re doing is saying, before you spend a million dollars with us—
Andrew Powell: Before you spend a million dollars, spend ten thousand and we’ll tell you if it’s a good idea or not. Or spend a hundred thousand and we’ll validate the idea.
Jim VanderMey: Yes. That’s exactly what we do.
Andrew Powell: Yeah. Yeah. That’s great.
Jim VanderMey: And because I don’t want to build products that nobody wants to use, neither do your clients.
Andrew Powell: Right.
Jim VanderMey: And so as consultants, we have to say that our goal is the customer’s success, not taking money to build something that’s a bad idea.
Andrew Powell: Right, right. Which is why I asked you about the “our competitors are doing it” case. Is there a case where something is inherently a good idea because that’s what everyone else is doing? Or is that fool’s logic?
Jim VanderMey: So you may choose to be a fast follower and say, we’re going to strategically choose to be the fast follower, because we’re going to allow the market to test whether these are good ideas or not. And we know, because we have the brand and we have the scale and we have the product knowledge, that we can be a fast follower, and we can then change and do something better than our competition and learn from their mistakes. That’s a willful choice of strategy. So you look at certain product segments, and there are niche companies that have introduced products, and then you’ve seen the dominant brand come in second. Take, for example, hybrid vehicles. There’s a real tension in the world right now about whether Tesla can scale, or whether other companies will learn from what Tesla has done, replicate the process at scale, and make Tesla functionally irrelevant in the automotive market.
Andrew Powell: And probably some of the big players in the auto industry are thinking exactly that. Tesla paved the way; they sort of proved out the idea of driverless cars and electric engines and that sort of—
Jim VanderMey: High capacity batteries and charging stations.
Andrew Powell: And now GM has the benefit of all of that experience and knowledge, plus the manufacturing scale that Tesla still has to build, to be able to manufacture cars much more quickly than in the Tesla model.
Jim VanderMey: Yes.
Andrew Powell: It sounds like cheating. It sounds like competition.
Jim VanderMey: Sure. Yeah. Such is the value of being first to market. There were MP3 players before there was the iPod.
Andrew Powell: Right?
Jim VanderMey: Yeah. The iPod wasn’t the first MP3 player. It was just the one that got it all right. And Apple was also in a place where they were able to create a business model to go along with it. That’s the other part of this: there’s a business model test that you can execute as a Riskiest Assumption Test as well. Is there a business model that we have to support through this new innovation that is going to change or create a new value proposition in the market?
Andrew Powell: Right. So Jim, how would I even know what my riskiest assumption to test is? How would I get there?
Jim VanderMey: So one of the best tools for that is the Business Model Canvas. In the Business Model Canvas, you write down the state of the current market. You write down what your value proposition is. You talk about the channel, the competition, the risks. And as you work through the canvas, you’re able to pretty quickly tease out the three or four things that have to be true for this to be successful. Then you create hypotheses related to those and say, “Well, how would I test these things?”
Andrew Powell: Right.
Jim VanderMey: How would I do an experiment? Who do I need to talk to to see if this is actually something that’s valuable or not? And one of the hypotheses might be: can we create a service around this that people will consume? Because maybe we’ve always sold our product as a discrete product sale, and now we’re going to offer it as a service.
Andrew Powell: Sure, you know, are people willing to buy my product as a service? So you might have three or four RATs in this scenario you just defined. Do you test those sequentially, or at the same time?
Jim VanderMey: They can be tested concurrently. I would say it depends on the dependencies between them. But it also depends on the team, because you might say, why would we do RAT number four if RAT number one is the one that’s going to validate or invalidate the entire program?
Andrew Powell: Oh, sure. So you might have the riskiest of your riskiest assumptions and say, well, if this one is not true, then nothing else matters.
Jim VanderMey: Right. And you can iterate very quickly through these experiments and say, well, if this experiment is proven true or false, then I have a next hypothesis.
Andrew Powell: Sure.
Jim VanderMey: And that also then creates a strategic framework for the product because this is what we’re now trying to achieve.
Andrew Powell: Great. Okay. I’ve got to take a little detour here for a second. Am I the only one bothered by RAT? Like, no one thinks, “oh, rats, they’re so cute and valuable to business.” No one thinks that.
Jim VanderMey: That’s one of the reasons I love the term because the moniker itself is so memorable.
Andrew Powell: Okay, alright.
Jim VanderMey: And because there are so many acronyms in our industry. Herman Miller, for example, got tired of the term MVP. They said, why should we do anything minimally? We want the maximum. And why do we want “viable”? When we talk about something being viable, it’s on life support. So we want valuable. And they said, everyone talks about the product, but we want to talk about an experience. So they would talk about the maximally valuable experience they could create,
Andrew Powell: The MVE
Jim VanderMey: The MVE. Or they might talk about the maximum achievable experience, given the constraints they were facing. So we get caught up in these acronyms, and in some cases they take on meaning outside of the definition and the acronym itself. But the RAT is very memorable. And it’s kind of fun to drop it in a meeting and say, well, before we go forward on this development project, we need to do a RAT. It immediately gets attention.
Andrew Powell: I can see that. And, to that point, not all positive attention, I imagine.
Jim VanderMey: Well, Greg, who is a senior executive at Bissell, says that they now have the RAT as part of every one of their product development discussions. It has become part of their definitional standard, and they thoughtfully include it as part of their process now. So even though it might not be warm and fuzzy (as you said, rats don’t have a lot of positive connotations), it works for us. And in every new engagement, as you talk through what you’re going to do, one of the things you’re asking yourself, and talking to your clients about, is: what’s your riskiest assumption? How are we validating that? I absolutely will bring that up. There are enough products in the junkyards of digital development, things that were highly anticipated that people don’t use. So I could say to you: how many apps do you have on your phone that you loaded, used once or twice, and then never used again?
Andrew Powell: Most.
Jim VanderMey: Yeah? And how many smart home products do you have that you purchased thinking you were going to get a certain amount of value from, and you didn’t maintain them? So there are places of resonance people will have, and you don’t want your product to end up there.
Andrew Powell: Right. Right.
Jim VanderMey: And so if you don’t want your product to end up there, then what can we test quickly that will enable you to have a product that people will engage with, be passionate about, give a five-star rating, and that creates a strong economic opportunity for you?
Andrew Powell: So does that create a sense of obligation for you, Jim? So we validated this. We went through this rat process. We determined that the marketplace wanted the product to be developed. Now we developed it and it still doesn’t get five-star ratings and nobody actually wanted it. Do you then have this sense of ownership in that decision or is there a process by which we re-examine our assumptions along the way?
Jim VanderMey: We absolutely re-examine our assumptions along the way. We have to learn from those early-stage development efforts, and that’s where the MVP comes into play. We are going to learn, and then we’re going to iterate, and then—
Andrew Powell: So the one leads to the other.
Jim VanderMey: So we create a loop, and we are constantly learning. And that’s where the data is so important: the data we collect about the use of an application, the data we collect from the device if it’s a smart connected device. It might be that your version one is something you create to collect data and
Andrew Powell: Sure. To inform your future versions.
Jim VanderMey: To inform your future versions. You’re creating an MVP, a version one, to learn so that you can be successful the next time. And that then creates a product roadmap.
Andrew Powell: Right.
Jim VanderMey: And the other interesting part is that there are now analytics you can look at. We did a project with one client where we ran analytics against all of their competitors in the public marketplaces to identify which keywords were associated with five-star versus one-star ratings. So now we’re using data to inform which features should be at the top of the list in order to get highly positive ratings.
Andrew Powell: Right, right.
Jim VanderMey: And what features do we—because it’s actually more important to not get one-star ratings than it is to get five-star ratings. A one-star will drag you down. It’s more damaging just from the way the averages work.
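The averaging point can be made concrete with a quick sketch. The numbers here are hypothetical, chosen only to illustrate the asymmetry: for a product whose average is already high, one new one-star rating moves the average several times further than one new five-star rating does.

```python
# Incremental average: the new mean after adding one rating to n existing
# ratings that currently average avg.
def new_average(avg: float, n: int, rating: int) -> float:
    return (avg * n + rating) / (n + 1)

avg, n = 4.5, 10  # hypothetical product: ten ratings averaging 4.5

after_one_star = new_average(avg, n, 1)   # ~4.18, a drop of ~0.32
after_five_star = new_average(avg, n, 5)  # ~4.55, a gain of ~0.05

# The one-star pulls the average down roughly six times as far as the
# five-star pushes it up, simply because 4.5 is much closer to 5 than to 1.
print(round(after_one_star, 2), round(after_five_star, 2))  # → 4.18 4.55
```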
Andrew Powell: Right.
Jim VanderMey: So the question becomes: what do we need to avoid doing to keep from getting one-star ratings?
Andrew Powell: Just make a better product. That’s all. It’s easy.
Jim VanderMey: That’s all?
Andrew Powell: So Jim, can we talk about how this applies or if this applies outside of durable goods and connected products?
Jim VanderMey: Oh yeah. Yeah. So there are service offerings. We have a client in the transportation industry who had an idea for a service offering. And what we suggested is: before you go into that conversation about your service offering, let’s just talk to customers. I’ve had this in a healthcare situation as well. The more people know, the more they assume they know.
Andrew Powell: Right, right.
Jim VanderMey: And so I call this: how do we avoid smart-people mistakes? It just requires a gentle nudge. But how many times have we worked on projects where they haven’t talked to customers early in the development process?
Andrew Powell: All the time.
Jim VanderMey: And so this could be around a service offering. This could be around a new website. This could be around an e-commerce platform. And the customer, our customer, the product owner company, assumes they know everything they need to know to create a great customer experience.
Andrew Powell: Oh, right. I think that’s the standard: they approach the project presuming they already know what they want.
Jim VanderMey: Right.
Andrew Powell: They already understand the customer and the customer’s need. That’s why they’re talking to you. That’s why they’re talking to a consulting firm.
Jim VanderMey: And so I think in that moment, to say: have you validated this? Not in the sense of a small group or a focus group, but have you gone out and observed people in the wild solving this problem you’re trying to address? Can we quickly inform an understanding of the economic value before we start building? Just a little more learning. And I think that’s something for any web developer, any mobile app developer. Are you really thinking that no one has solved this problem before? Because they’re solving the problem manually, and what you’re fighting against isn’t another product. You’re fighting against the status quo.
Andrew Powell: Right.
Jim VanderMey: And if there’s not enough pain in the status quo to take on your product and persistently use it, then it’s likely not to be adopted at scale.
Andrew Powell: No matter how great it is.
Jim VanderMey: Yes, because you’re competing with the status quo. You’re competing with good enough.
Andrew Powell: Fascinating. So do you use this to pick what restaurant you’re going to, what you’re going to have for lunch?
Jim VanderMey: No, I don’t. I tend not to take a lot of risks when I’m picking what I’m having for lunch. So no, I tend to avoid rats for lunch.
Andrew Powell: So I want to circle back around and just try and sum some of this up. So I use a Riskiest Assumption Test to validate that the thing we think is true really is true.
Jim VanderMey: Yes.
Andrew Powell: That informs the way we approach building our minimum viable product. In the end, then, we get a product that we feel more confident resonates with the customer, so that—no, we get a product that resonates with the customer more strongly. That’s what we get, right?
Jim VanderMey: Yes.
Andrew Powell: We feel more confident about it too. Is the process longer? Is the process more elaborate?
Jim VanderMey: The process is actually faster.
Andrew Powell: Great.
Jim VanderMey: That’s one of the interesting side effects, because in a traditional agile development process, you don’t get the feedback until you have the build. You don’t get the feedback until you have high-fidelity design artifacts. With a RAT, you create only the minimum necessary to validate a hypothesis. It’s only one small piece of the total product or service, so you get feedback a whole lot earlier, because you’re getting feedback off that tiny little piece. You’re getting validated learning early in the process, which can then inform the design process. And given that oftentimes the things that are riskiest are the things that are hardest to design and build, you are shaving a considerable amount from the development time and budget, and you’re giving more confidence and certainty to the design team and the development team. So it actually is faster this way. Now, we hopped right into the subject at hand today, Andrew, and didn’t get introductions. So who are you?
Andrew Powell: Oh, geez, I don’t know. I’m Andrew Powell. I work for your company. Yeah, I manage our application development practices. That’s what I do.
Jim VanderMey: That’s what you do. And Andrew has some of the most fascinating hobbies of anyone I know here at OST. So what are some of the things you do in your side gigs?
Andrew Powell: What do I do in my free time? Before I cover what I do in my free time, Jim, I just want to say thank you for acknowledging my fifth anniversary, which happens this weekend. Yes, I’ve been at OST for five years. In my free time, I do everything, Jim. You know, Laura and I were talking about this earlier. I think I’m an entertainer. That’s what I am in my free time. I do magic and I produce shows and I write plays and I write books and I think that’s everything.
Jim VanderMey: You do comedy?
Andrew Powell: I do comedy. Yup. I do some standup and some improv, mostly as characters. Yeah. Yeah. I think that’s it. My wife owns a comedy club. I should explain. It’s just down the street from OST. It’s called The Comedy Project. They opened a few months ago.
Jim VanderMey: So I love the fact that you are a person of diverse interests, and you bring that energy into the application development team as well as into your personal life. One of the things that we do on this podcast, because OST is in an old game company’s building, is ask newcomers: what is your favorite game and why?
Andrew Powell: Oh boy. What is my favorite game and why? It doesn’t have to be a wooden board game. I’m going to answer the question, but before I do, can I tell you a story, Jim? Can I tell you a story about this building we’re in? It’s where my mind went when you said this. When I was a young boy growing up in Grand Rapids, Michigan, we used to drive through downtown Grand Rapids and there was that giant chess piece on the side of the factory.
Jim VanderMey: Yes.
Andrew Powell: Yeah. And I used to tell my parents that when I grew up, I was going to work there, because I wanted to make games. I still make games; I’m a game hobbyist. And it’s ironic to me, as I think through my five years at OST, that I am in fact working in that building I wanted to work in.
Jim VanderMey: That’s fantastic.
Andrew Powell: Well, not the same business, but the same building; I’m working there. What’s my favorite game? That’s the real question you asked. That’s tough. I’d probably say my favorite game is poker, and I feel bad saying that, because do I like poker best just because I’ve spent so much time playing it? I used to play poker semi-professionally. I don’t know, but I do love poker. I love poker because there’s an aspect of it that is most definitely luck, but there’s an aspect that is skill. Anyone who plays poker can get better just by playing more, but also, knowing people better, knowing who you’re playing with, and understanding the inner dynamics of human communication make you a better poker player. So it covers the full gamut of human experience in a way that is meaningful to me, even though that sounds so hokey now that I’ve said it.
Jim VanderMey: Well, and you are also able to put that to good use, because every year at our March Madness party, you are actively engaged at one of the poker tables we have going upstairs.
Andrew Powell: That’s true. I did deal poker for eight hours straight last March and the March before that.
Jim VanderMey: So, all right, well, thank you. This has been a good day today, so.
Andrew Powell: Thanks, Jim.
Lizzie Williams: OST, changing how the world connects together. For more information, go to ostusa.com