Test the Untestable: 3 Principles for Smarter Innovation Decisions – 0015

March 26, 2019

Driving Eureka! Newsletter #15

This is the third episode of the Driving Eureka! Podcast. Segment 1: Test the Untestable: 3 Principles for Smarter Innovation Decisions; Segment 2: Data Driven Decision Making; Segment 3: Brain Brew Whisk(e)y Academy.

Subscribe to learn how to Find, Filter, and Fast Track Big Ideas.


Show Notes

The Driving Eureka! Podcast

Episode 15

Feature Article – Test the Untestable: 3 Principles for Smarter Innovation; Driving Eureka! Book Segment – Data Driven Decision Making; and the Brain Brew Whisk(e)y Academy

Test the Untestable

Need Help with Innovation Data Decisions? – Merwyn Decision Support Group

85 – 95% Failure Rate in Innovation

Merwyn Truth Teller Can Help You Make Better Decisions

Download Truth Teller Paper

Win More Than You Lose

Hitting .500 vs. .150 in Innovation – a Baseball Analogy

Make Money by Having Better Data as You Move Through the Development Process

Ego Can Lead to Bad Decisions

Innovation Requires Learning – Not Know It Alls

3 Principles for Smarter Innovation

Dealing with Uncertainty in Numbers

NIST-Validated Forecasting System

New Research You Need to Forecast

Triangulate on Decision Support

Driving Eureka! Book Segment – Problem Solving with Data Driven Methods

Innovation Without Data is Stupid

Problem in Innovation – One Test in the Beginning and Then No Tests as Product or Service Develops

Numbers are Our Friend

Sensation Transfer

Paired Comparison Test vs. Sensation Transfer

Merwyn Decision Support Group

Better Innovation Decisions – 3 Step Process – Cuts Costs 70-80%

Brain Brew Whisk(e)y Academy

New Data – New Learnings

Red and White Wine Drinkers Like Different Whisk(e)ys

Using a Correlation Matrix

18% Drank Whisk(e)y in Last Month in the USA

3.6% Drink Whisk(e)y Neat

Introducing the Whisk(e)y Wizard

You May Determine a Whisk(e)y by the Beer or Wine You Drink

Craft Cocktail Recipe – The Bourbon Milk Punch






Tripp: [00:00:01] Welcome to the Driving Eureka! podcast, where we share ideas and advice for helping you find, filter, and Fast Track Big Ideas.


Tripp: [00:00:14] Hi, I'm Tripp Babbitt, advisor to global organizations on the Deming philosophy and host of the Deming Institute podcast.


Doug: [00:00:23] And I'm Doug Hall, inventor, speaker, teacher, and whisk(e)y maker. I'm also the founder of the Eureka! Ranch and author of the Driving Eureka! book.


Tripp: [00:00:34] This is episode number 15 of the Driving Eureka! podcast. The discussion is about Doug Hall's newsletter from January 31, 2019. You can read the newsletter by going to DougHall.com; on the menu, click Newsletter.


Tripp: [00:00:51] This week's feature article is Test the Untestable: 3 Principles for Smarter Innovation Decisions. The Driving Eureka! book segment is about data-driven decision making, and the Brain Brew Whisk(e)y Academy will discuss the topic New Data, New Learnings. And of course we'll talk about the craft cocktail recipe.


Tripp: [00:01:15] So let's start with our feature article, Doug. It's Test the Untestable: 3 Principles for Smarter Innovation Decisions. Why did you start down this path?


Doug: [00:01:29] You know, you go to a company, whether small, medium, or large, and at some point they'll lean in and they'll say, "You know, we're a very conservative company." I'll say, "Oh, that's surprising."


Doug: [00:01:46] I mean, I'm still waiting to hear a company say, "We're not very conservative."


Tripp: [00:01:51] So what did they mean by conservative? How did they answer that?


Doug: [00:01:55] "We don't do rash things. We're practical and prudent as we do it." I would joke that the fact of the matter is that most of them are actually high-stakes gamblers. The reason I say that, and it may seem harsh, is I'll ask, "So that would mean you use data to make decisions?" "Oh yes, we're very data driven." "OK. So when you made the decision to do that new system, product, whatever it might be, what data did you use?" "Well... we didn't." "You mean you didn't actually use data on that?" "Well, we couldn't get data. That was just not measurable." And see, this concept that it's unmeasurable is something that, quite frankly, 10 to 20 years ago was true. There were many things you could not quantify, so they were right. The fact of the matter now, though, is that as computer power has come and as new methods have come, what we're finding is that there is nothing we cannot provide data on. So it is possible. I've got a team at the Ranch called the Merwyn Decision Support Group where we sell services to support this, and we can help you if you want, but I'm going to lay it out. What we found is that you can test the untestable. That's the slogan we use with people, and we have to be that blunt to get through, because people have come to accept "I can't make a decision with data."


Doug: [00:03:32] So I oftentimes joke: you know why you have CFOs who don't support innovation? Well, guess what, it's because they're reasonable people. Data after data shows that when it comes to an idea, and I'm talking here about ideas that go to market, though it's the same for internal innovations, 85 to 95 percent of the time the idea is still not in business two, three, four years out. It fails. The first year they may spend a lot of money to get it going, but fundamentally it just withers away. Small businesses don't go away; they just keep it down at a small level and it just kind of runs out. But to say that it has improved shareholder value, that it has helped the company grow? We've got an 85 to 95 percent failure rate. So when the CFO says "I don't think we should do it," they're right. The nut case is the one who says we should go for it anyway, even though we've failed every other time we've done it: "this time it's going to be different." Now, the fact of the matter is, it is right to be conservative with shareholder money.


Doug: [00:04:49] It is right to be prudent, but just saying it isn't doing it. What you have to do is become data driven. In fact, being data driven is fundamental not only to increasing speed, because when you're data driven you get into fewer debates, but also to making fewer bad decisions. Just as an example, we have a research tool called Truth Teller. It's a unique system, and we don't have time to get into the details of it here. But fundamentally, in long-term tracking studies we took decisions that were made by senior executives at Fortune 100 companies, and then we had the Merwyn Truth Teller system's evaluation: go or no go, basically, should they do it or not. It was seven times more likely to make a smart decision, either to kill the idea or go with it, when you used the Truth Teller system versus human judgment. Seven times. That's not seven percent; that's seven hundred percent, OK? And so when you start from that five to 15 percent success rate, you can see how you can get over 50 percent. You can win more than you lose. It's very doable.


Tripp: [00:06:09] OK. So you're taking basically this 85 to 95 percent failure rate, and the success rate now becomes what, say 40 percent?


Doug: [00:06:25] Oh, 50 percent. My goal was always to win more than I lose, OK? Because I can get to 80 or 90, but the way I get to 80 or 90 percent on each idea is I make them less unique.


Doug: [00:06:40] And then I don't get the return, I don't get extra pricing, I don't get extra margin, I don't make more money. So, 50 percent, when you're really trying to grow your audience and make more money, because if you're not unique you'd better be cheap. Much more than 50 percent, and I think you're just playing it safe.


Tripp: [00:07:02] OK, so let's just take 50 percent. And you tell me if this is a good analogy or not, but if I'm a baseball player and I'm hitting, you know, .050 or .150, then I'm not a very good batter; if I'm hitting .300, I'm awesome. And what you're talking about is basically hitting .500.


Doug: [00:07:26] That's right. And the key to that, and the key to making money on that, is that you have better data all the way through the process. So data is not just at the start. The most important data is through the development process and as you're going to market, to see how things are going, so that you can make midcourse corrections. Because, remember, "set it and forget it," create the idea and just go, that's delusional. As you go through the process of development and as you get into the marketplace, things change. Competition adjusts. You're trying to engineer something and the technology doesn't work, so you have to make adaptations. The biggest, stupidest thing in the world is to test the idea at the front and never test again. In fact, one of my ways to understand the success of a company, and I don't even have to know their success rate, is to ask about an idea that went out: what tests did you do, what data did you gather on the performance of the idea, the appeal of the idea? And they'll say, well, we ran a test in the beginning. OK, so show me that thing.


Doug: [00:08:32] And they'll show me a write-up or a mock-up or a prototype or a CAD drawing or whatever it might be. And I'll say, now let me see what you actually shipped, and of course the two don't have anything to do with each other; it changed. I'll say, well, when you made those changes, did you test? "Oh no, no, that's not testable." Well, yes, it is testable. Yes, it is. The new methods allow you to do it at literally 10 percent of the time and cost. We can do that. OK. So.


Tripp: [00:09:03] So in some of the neuroscience research I've been doing, there's a book called Decisive, and some of the things I've read say there are four things that really block a leadership team. Help me figure out, once I get through the four, how this helps mitigate them. One is narrow framing: they're not looking at enough options. It's kind of a go/no-go type of thing: we've got one idea, and we're either going to go or not go with it.


Tripp: [00:09:41] And I think you kind of hit upon it already, but we'll go back and revisit it. There's confirmation bias, where you're only looking for things that support your idea when you're doing your research. Emotion plays a large role in the decision-making process with executive teams. And then there's overconfidence in their idea.


Tripp: [00:10:02] So how does this system, when we step back and look at those four things, narrow framing, confirmation bias, emotion, and overconfidence, overcome these difficulties that seem to manifest themselves in almost every organization?


Doug: [00:10:21] Well, I'd gather all of those under what's called ego, OK? It's fundamentally ego. The narrow focus is "I already know what it is, so I don't need to see anything different. I love it, so don't you love it? And of course I'm right, because I'm the big boss, and I wouldn't be able to do this if I wasn't the big boss."


Doug: [00:10:47] I mean, that's what's called HiPPO, the highest-paid person's opinion, as they call it. And those are just different ways to say they don't have a culture of learning. It's a Mr. or Ms. Know-It-All, a person who feels they know the answer and don't need anything else. And that's why chief financial officers don't want to do it. And by the way, as a shareholder of your company, I hope you don't do it either, because you're actually going to lose less money if you don't. I know companies that would lose less money if they stopped all of their innovation. That's absurd; I'm a guy who wants to do innovation. But we've either got to confront this reality and become data driven, or we've got to just stop the innovations. Stop them, stop deluding ourselves, stop lying to ourselves about this, and just run the business as efficiently as you can until you get bought up, and then go on to the next thing, OK?


Tripp: [00:11:57] Oh yeah.


Doug: [00:11:59] I get all that. I mean, I ran into this big time because at the Eureka! Ranch we work across so many different industries, whether it's oil drilling rigs, which I don't know anything about, or products priced for millennials, which I don't know anything about either. And so what I've learned over the years is two things. One is I don't know what I don't know. And second, the more unique you are, the more likely it is to be wicked good or wicked bad. So if you want to stretch the edges, you have to have a humbleness and let the data guide you, not your ego. You put out stimulus, you get data on it, then you learn, then you pivot, and you keep doing that.


Tripp: [00:12:42] Before I move on to the book segment: how do I know when I need you? For instance, say I've already got an idea out there and people are struggling. Even in your Driving Eureka! book, you tell a story where someone had called you in to help their team and it was already too late. So if you're not on the front end of coming up with the ideas and you haven't built some of these things in, can you help a company that's in the middle of an idea and struggling with it?


Doug: [00:13:24] Well, struggling or not, it can always be better. You've got to put a foundation underneath it, and there are three fundamental principles you need in order to be able to do this. Even if you've gone along and gotten a good run but now you're running into bumps, or even if you're not, you've got to have a foundation underneath the idea. So the three principles for smarter innovation decisions are: first, you've got to build your systems so you can test at the speed of decision making. In the past we had a false sense of precision, and we would take a long time to put tests together, run them, and analyze them. Then it's not worth testing, because you've got to make a decision, and especially when you have lots of iterative decisions you have to go faster. The good news is that by using some of the new digital technologies and artificial intelligence systems that exist, you can cut the timing and cost by up to 90 percent. In fact, one multinational we work with found they actually cut it by 93 percent versus the classic cost of doing this. So you've got to build speed, because the longer it takes to do a test, the less likely somebody is going to go get data.


Doug: [00:14:44] So you've got to vaporize the time and the cost. Whether you use us or other systems, you've got to do it fast. The second thing is that we have a tendency to look for the magic test, this one test that's going to answer everything. Well, in carpentry they say measure twice, cut once; I say measure twice or three times, decide once. Sometimes there isn't one definitive test, but you can triangulate by using different methods, whether it's Delphi, artificial intelligence systems, rapid research, Truth Teller. There are different methods that can triangulate for you, so you get perspective on your idea from two or three tests. Remember, research is decision support. It's an aid to your judgment. It will not give you the answer, but it makes it so you can make a smart decision. So: testing at the speed of decision making; measure twice or three times, decide once; and the third thing is you've got to confront reality by using risk-adjusted forecasting. There's a tendency in the Western world to want to know the answer.


Doug: [00:16:03] How much are we going to sell? How much money are we going to save if we do this cost-savings thing? Well, if you do it and you hit that exact number, then somebody cheated, because there is variance in that number. And I'm sorry, folks, I understand you didn't like the statistics course. I understand you don't like standard deviation. I understand you don't like uncertainty. You want to know the number. Well, you're going to have to get over yourself, because uncertainty is fundamental to all of these measures. We're projecting the future, and the way to get smarter instantly is to do risk-adjusted modeling, where you're modeling using the standard deviations, using the variance, using Monte Carlo simulations, five-year trial-and-repeat diffusion models. And if you don't know those words, go hire somebody that does, or rent somebody that does, because you've got to do it.
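The risk-adjusted modeling Doug describes can be sketched in a few lines. This is a minimal Monte Carlo illustration, not the Merwyn system: the trial rate, market size, and revenue figures are invented for the example, and a real model would also simulate repeat purchase and diffusion over time.

```python
import random
import statistics

def risk_adjusted_forecast(trial_mean, trial_sd, market_size,
                           revenue_per_buyer, n_sims=10_000, seed=42):
    """Monte Carlo sketch of a risk-adjusted first-year revenue forecast.

    Instead of one point estimate, the uncertain trial rate is drawn
    from a normal distribution, and the spread of outcomes is reported.
    """
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_sims):
        # Draw an uncertain trial rate; clamp to a sensible [0, 1] range.
        trial_rate = min(max(rng.gauss(trial_mean, trial_sd), 0.0), 1.0)
        outcomes.append(trial_rate * market_size * revenue_per_buyer)
    outcomes.sort()
    return {
        "p10": outcomes[int(0.10 * n_sims)],   # pessimistic case
        "p50": outcomes[int(0.50 * n_sims)],   # median case
        "p90": outcomes[int(0.90 * n_sims)],   # optimistic case
        "mean": statistics.mean(outcomes),
    }

# Hypothetical inputs: 4% +/- 1.5% trial rate, 2M households, $25 each.
forecast = risk_adjusted_forecast(0.04, 0.015, 2_000_000, 25.0)
print(f"P10 ${forecast['p10']:,.0f}  P50 ${forecast['p50']:,.0f}  "
      f"P90 ${forecast['p90']:,.0f}")
```

The point of the percentile spread is exactly Doug's argument: the forecast is a range with risk attached, not a single number to "hit."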


Doug: [00:17:03] OK, if you don't, you're at a disadvantage versus others. I mean, we had NIST, the National Institute of Standards and Technology, literally validate our forecasting system, the first time ever for forecasting technologies in federal labs, which is about as difficult to do as anything in the world. You can quantify these things, but you're going to have to model the uncertainty, and what that does is give you a risk-adjusted view. When you do that, you end up in a better situation, whether you're a commercial company, a consumer company, a business-to-business thing.


Doug: [00:17:37] I mean, we once did a thing with the EPA. They wanted to put in regulations for refrigerators, and they wanted to know: if they put this in, how would it impact sales of refrigerators? Was there a consequence to it? And we found there wasn't, and they did it. I mean, refrigerator design changes driven by regulation; we can model that. Do not tell me that you can't. But if you have a high-speed system, people are going to use it. If you use multiple measures, and if you use probability modeling, risk-adjusted forecasting, then you can dramatically reduce risk. It is a no-brainer to do this. I'll tell you, though, that it starts with somebody saying "I need to learn more," because they didn't teach you this stuff in school; it's new, folks. You go to your research people, they don't know about it; it's new. This is a new world that exists now, and it's built into the innovation engineering movement; it's built into the tools. We can help you, but there are other people that can help you too. Get the newsletter. People in statistics know what the terms I put in there mean, and if you don't get statistics, get it or rent it, because failure is your other option.


Tripp: [00:18:53] Okay. So I know I falsely promised that we'd move on to the next segment, but I've got to ask the question. When you say triangulate on decision-support data, can you dumb that down a little bit for me? What exactly are we doing there?


Doug: [00:19:11] Well, say you're looking at a situation where you've got an idea going to the marketplace, but your route to market might involve distributors and industrial companies, or distributors, retailers, and consumers, or, in fundraising, major donors, then minor donors and government grants. You might have two or three groups that, at the end of the day, are the fundamental dimension.


Doug: [00:19:45] So one way to do this is to do an assessment of each of those groups to see what's the probability that each one of them is going to open up and support you. That's one way. Another way, B, is like when I was talking about the EPA: with the EPA we had to use a combination of the Truth Teller system, which is a calibration system, and consumer data. We used both of those numbers, got two different measures, and then we used the variance between those two to do our modeling, OK?
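One textbook way to combine two imperfect measures of the same thing, as in Doug's EPA example, is inverse-variance weighting: the more certain measure gets the larger say, and the pooled estimate is more precise than either alone. A minimal sketch; the scores and standard errors below are hypothetical, and this is a classic pooling formula, not the actual Merwyn calibration.

```python
def pool_estimates(estimates):
    """Pool independent estimates of the same quantity.

    Each estimate is a (mean, std_error) tuple; each is weighted by the
    inverse of its variance. Returns (pooled_mean, pooled_std_error).
    """
    weights = [1.0 / (se ** 2) for _, se in estimates]
    total = sum(weights)
    pooled_mean = sum(w * m for w, (m, _) in zip(weights, estimates)) / total
    pooled_se = (1.0 / total) ** 0.5   # pooled uncertainty shrinks
    return pooled_mean, pooled_se

# Hypothetical: a Truth Teller-style score of 6.2 +/- 0.8 and a
# consumer-survey score of 5.4 +/- 0.5 for the same concept.
mean, se = pool_estimates([(6.2, 0.8), (5.4, 0.5)])
print(f"pooled score {mean:.2f} +/- {se:.2f}")
```

Note the pooled mean lands closer to the tighter (lower-error) measure, and the gap between the two inputs is itself a signal of how much uncertainty remains.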


Tripp: [00:20:25] And were the numbers then very far apart?


Doug: [00:20:29] Sometimes they aren't. Sometimes they can be very far apart, and sometimes they can be very close. I mean, we'll do tests where we'll take the idea and test the communication clarity of it, which impacts what's going to happen; at the same time, we'll score consumers' first opinions on the idea, which gives us an opinion on the thing; and then we might turn around and use the Truth Teller system, which gives us a more business-model view.


Doug: [00:20:57] Does this whole idea have the scale that we need? Sometimes those will all come out the same, which gives us high confidence, and sometimes they'll have high variance. Now you might say, well, if you get high variance, that means the tests aren't good. No, it means there is high uncertainty. I mean, just in measuring your customers: I can ask them on a zero-to-10 scale how likely they are to buy. I can have everybody come in at four, five, or six and get an average of 5.5. I can also get a 5.5 with a whole lot of ones and a whole lot of nines, OK? And what we found in the past is that when you have that high variance, people say, well, obviously it's a polarizing idea. Well, interestingly, when we match that with clarity research, we find the idea isn't clear. The person giving it a nine is interpreting the idea one way; the person giving it a two is interpreting it in a different way. So when you combine clarity research with the consumer research, I get a better sense of what's going on, and sometimes we can then fix it, because the first thing you want to do is fix it, make it work. But sometimes we have to accept that we don't know what we don't know and we're going to have to see what happens in the marketplace, so we might need to run a test market first before we scale up, OK?
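The two 5.5 averages Doug describes are easy to demonstrate with invented score lists: same mean, very different spread, which is why reporting only the average hides the story.

```python
import statistics

# Hypothetical 0-to-10 purchase-intent scores from two concepts.
consensus = [4, 5, 6, 5, 6, 5, 6, 5, 6, 7]   # everyone near the middle
polarized = [1, 9, 1, 9, 1, 9, 1, 9, 10, 5]  # lots of ones and nines

for name, scores in (("consensus", consensus), ("polarized", polarized)):
    # Identical means, but the standard deviations differ hugely.
    print(name, statistics.mean(scores), round(statistics.stdev(scores), 2))
```

Whether that spread means a genuinely polarizing idea or just an unclear concept statement is exactly what pairing it with clarity research is meant to sort out.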


Tripp: [00:22:26] And then that informs the judgment types of decisions associated with that particular product and how it's moving forward.


Doug: [00:22:35] Yeah. It makes you make decisions on your investment strategy. Are we going to build a big factory, or are we going to work with contractors for a while before we find out, so we can mitigate our risk?


Tripp: [00:22:48] And so the two parts for the Eureka! Ranch are basically: one, you can come in and help companies come up with this data, or two, you can teach them how to use it.


Doug: [00:23:04] Well, that's right. But the first thing is, just take the newsletter to somebody that understands statistics and have them help you. You can just do that. The stuff I'm talking about, probability modeling, multiple measures, speed through digital systems, those exist in the world. We happen to be experts at it, and if you need help, call us and we can help you do it. But that's really up to you.


Tripp: [00:23:27] Okay. Let's move to the Driving Eureka! book segment.


Tripp: [00:23:41] It’s time now for the Driving Eureka! book segment with author and inventor Doug Hall.


Tripp: [00:23:54] So now we're talking about problem solving with data-driven methods, which is the subtitle of the book Driving Eureka! We're talking about reducing fear. What fear are we reducing?


Doug: [00:24:11] Well, with all new ideas there's fear: fear of the unknown, fear of failure, etc. And we know that fear has a direct correlation with your organization's ability to create, communicate, and commercialize Big Ideas, to attempt to make stuff happen. And there are two fundamental ways to reduce fear, as I write in the book. As you mentioned, the book's full title is Driving Eureka! Problem-Solving with Data-Driven Methods.


Doug: [00:24:39] So you might notice I'm a little bit passionate about this stuff. I mean, people say innovation is too risky. If you don't use data, yes: it's not just risky, it's stupid. Now, there are two ways to deal with fear. The first is to make the unknown known. As I write in the book, we do this by writing the idea in a very clear way for the customer or the stakeholder: What's their problem? What's the promise we're making? What's the proof that we can do it? And what's the cost? As well as a game plan that says how we're going to either save money or make money. So the first thing is to make sure we have clarity about what the idea is, because we may all be talking about a different idea; you find that over and over again. Define the idea from the perspective of the stakeholder, in the case of systems, or the customer, in the case of products or services. The second involves the data, which means doing what's called the Deming cycle, Plan-Do-Study-Act, or PDSA, where we use rapid experimentation, we gather data, and we use data to guide our decisions. Plan-Do-Study-Act is not a wandering-in-the-wilderness thing. It's the plan, the do, and the study, and the study ideally is against something factual. And what we find is that when you embrace this, when you have teams of people together, fear has an exponential kick on the number of ideas: the lower your fear, the more likely you are to create, communicate, and commercialize meaningfully unique ideas that can make a difference for your organization.


Doug: [00:26:26] So we want to drive out fear, and we do it by making the unknown known and by adopting the cycle of Plan, Do, Study, Act. And that means data. I know Deming said the most important figures are unknown and unknowable. That was true in 1980. It's not true today: with computers, probability modeling, Delphi, and artificial intelligence systems, we can put quantification on these things. That's the reality.


Tripp: [00:26:57] OK, so again I want to reference the story you tell in Driving Eureka! about the water company. They had a way to filter water, but they'd only run one test, really, at the very beginning, and then nothing for, I don't know how long, but it was a long time. As I'm listening to you talk: is one of the biggest problems you run into the fact that they test the idea at the beginning and then never run a test on the product or service again during the development process?


Doug: [00:27:37] Yeah. So that was an example where I met with them in the morning. I'd been brought in by senior leadership; the team had been going for 18 or 24 months, I can't remember, and it was not going anywhere. They said they could filter water so it tasted better than bottled spring water. I said, well, that's wonderful. How much better? "Well, we don't really know." What do you mean you don't know? "Well, we don't really know." And this is a big campus. I mean, it's a big, big company, with all these buildings everywhere, you know, Building B-1723, that kind of place.


Doug: [00:28:11] And I said, well, you've got a cafeteria here? "Oh yeah, we've got a cafeteria." Do a lot of people go there? "Oh yeah, a lot of people." I said, what do you say we go out and get some spring water and get yours, and we'll just set up in the cafeteria and have them taste A and B, a paired comparison? "Oh, we can't do that." Why can't we do that? "Well, we couldn't get the water." I said, well, I've got a rental car; I can go get it. "Well, no, no, no. We'd have to get permission."


Doug: [00:28:40] I said, at P&G I used to do this stuff all the time. They'd think it's cute. "No, no, we can't do it." And they came up with every objection, most of which made absolutely no sense. And I finally came to the conclusion that they didn't believe their product was inherently better; they did not want to know. Tim Feely at Procter & Gamble once taught me something. We'd run a test and the test bombed. I had a team of three; we put nine products on the market in twelve months, and the reason we were able to do that is exactly what I'm talking about, by the way, folks. This is my life. And we'd been devastated. And Tim Feely, product development, just a genius of a product developer, we're all retired now, came in and said, "Numbers are our friend. This is a great moment for you. Now you know that it doesn't work. Now get ready and do your next one." And it was funny, but it's: confront the reality now or later. Well, they didn't confront it, and six months later management shut them down. And they didn't need to, because even though maybe their product wasn't better, they could have done, and I'm going to go geeky on you now, sensation transfer, to find out which traits they were winning and losing on, which is the easiest test in the world. Let me tell you this: if your organization doesn't do paired comparison and sensation transfer, then you are at a major, major disadvantage. Just call the Ranch and have us do a presentation for you on what this is. But you've got to start doing it. You've got to start doing paired comparison. I promise you, a basic survey tool like SurveyMonkey won't let you do these things, because they don't have them.


Doug: [00:30:23] So, all right. People say, "I don't know how to do this." They don't know the statistics, and they don't do it, because we've become simpletons. And I'm tired of people telling me innovation doesn't work when they're not using the state-of-the-art skills of today. And don't tell me "that's the way we've always done it," because what you've always done has never worked. So stop doing what you've always done, because it's not working.


Tripp: [00:30:47] OK. Well, we've talked about one of the two things you just mentioned, but I'm not familiar with sensation transfer. What is that?


Doug: [00:30:56] So sensation transfer: you set up two offerings. As an example, we might set out two whiskeys; we might take a really expensive whisk(e)y that costs two or three times what ours costs. We'll have people taste product P and product Q blind, with no identification. They say which one they're more likely to buy, which one is more new and different. And then we'll ask them: which one has a better aftertaste? Which one is richer? Which one feels like it would be more expensive? Which feels like it's older? Which has a greater smoothness to it? A collection of traits associated with whiskeys. And then you find a couple of things. One is that if you're losing on the overall measures, you find the things you're really losing badly on.


Doug: [00:31:48] So, you know, with our smoked product, in the beginning what would happen is we'd ask which one has a better aftertaste, and we would get flushed; I mean, we'd lose like 10 to one. It's like: we've got to fix that. It points you to where you're strong and where you're weak and gives you a better sense. By the way, it's oftentimes fun for us, because here we're taking this young juice that we've used time compression on, and we're winning two to one and three to one on "tastes older" and "tastes more expensive." And that is seriously cool.


Tripp: [00:32:24] So what's the difference, then, between a paired comparison test and a sensation transfer?


Doug: [00:32:30] Sensation transfer is just an expanded paired comparison. You do the paired comparison, but then you put the traits down and keep doing the comparison between the two. You can also turn around and do a regression with it — which we do automatically in our software — that tells you what factors are driving appeal. What's driving people to like your product better than the other one, or the other one better than yours? What are your strengths, what are your weaknesses? This is deep information, much more than asking what you like and dislike, which tends to be imperfect. This is really good, solid data. Frankly, that's why within a year we won a double gold at the North American bourbon whisk(e)y championship — it was because of paired comparison. It's not like we're geniuses; we're just data driven, and most of the industry isn't. I've tested more whiskeys than anybody on planet Earth — I did work for Diageo, for Edrington. We have by far the biggest database in the world on whisky testing. And this is one of those cases where, with data, the rich get richer and the poor stay poor. That's just the way it is.
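The analysis Doug describes — regressing overall preference on trait-level paired-comparison scores to see which traits drive appeal — can be sketched in a few lines. This is a minimal illustration with made-up data and trait names, not the Merwyn software:

```python
import numpy as np

# Hypothetical paired-comparison data: each row is one taster's score for
# Product P minus Product Q on three traits, plus their overall preference.
# Positive values favor Product P. These numbers are illustrative only.
traits = np.array([
    # aftertaste, richness, smoothness
    [ 1.0,  0.5,  0.0],
    [ 0.5,  1.0,  0.5],
    [-0.5,  0.0, -1.0],
    [ 1.0,  1.0,  0.5],
    [ 0.0, -0.5, -0.5],
    [-1.0, -1.0,  0.0],
])
overall = np.array([0.8, 0.9, -0.6, 1.0, -0.3, -0.9])

# Least-squares regression: which trait differences best explain
# overall preference? Large coefficients mark the drivers of appeal.
X = np.column_stack([np.ones(len(traits)), traits])  # add intercept column
coef, *_ = np.linalg.lstsq(X, overall, rcond=None)

for name, c in zip(["intercept", "aftertaste", "richness", "smoothness"], coef):
    print(f"{name}: {c:+.2f}")
```

With real panel data you would also check statistical significance of each coefficient before acting on it, as Doug stresses later in the episode.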


Tripp: [00:33:39] And for folks that are listening: in Episode 7 we talked about paired comparison testing in the Brain Brew Whisk(e)y Academy segment and what a differentiator it is. And you mentioned there, Doug — your voice made clear how crazy you find it — that companies aren't doing these things. I just wanted to note that these are things companies can do today to differentiate themselves and improve the products they already have in the marketplace.


Doug: [00:34:18] Well, one of my New Year's dreams for this year is to get after this issue. To that end — and I've been ranting about it so much — Maggie Nichols, the CEO of the Eureka! Ranch, has now restarted a market research group we've had at various points: the Merwyn Decision Support Group.


Doug: [00:34:43] She's just started that group up again, because while we teach people to do it, what we're finding is there are a whole lot of companies where, before we can even teach them, we've got to just show them — we've just got to do it. And so she's just hired a lady, Vicki, who's amazing, from one of the top research firms in the world, one of the world's experts at forecasting. With her and Greg and others, we've got a whole team where we can just come in and do it for you.


Doug: [00:35:11] À la carte, because we've got to do something about this. If we don't get toward data, then everything else we're talking about with Innovation Engineering isn't needed — because if you don't do data, it's just the same stupidity we've always had, with some design foolishness put on top of it that's not going to solve the problem. Facts and data are our friend, as Tim Feely says.


Tripp: [00:35:37] So that's the reason this team's been assembled, then — to help organizations come up with decision data so that they can make better decisions?


Doug: [00:35:48] Yeah, we think of it as a three-step process. The first step is à la carte: they'll have us do some tests for them. The second step is a monthly package we've put together, where we show up on a regular basis, push them to do more testing, and help them do it. When we do it that way, it's halfway between doing it for them and teaching them — we're enabling them. And that cuts their cost for the tests by 70 or 80 percent. We've literally taken the cost of us doing the test and cut it by 70 or 80 percent, and in the process we're enabling them so they can sustain it. And then we go all the way to teaching them Innovation Engineering, if they want. It's a stepwise process to meet people where they are. You've got industrial companies that have never really done research in their life. They've called it research, but show me the statistical significance and the standard deviations. No standard deviation, no statistical significance — that's not quantitative research. Qualitative is nice, but I wouldn't bet on it.


Tripp: [00:36:57] Okay. One last thing I want to cover before we go to the Brain Brew Whisk(e)y Academy. You show the effect of fear on meaningfully unique ideas invented: low-fear groups, 40; medium-fear groups, 34; high-fear groups, 31. How does this fit into what we're talking about?


Doug: [00:37:19] Well, this is just a measure of teams when they were creating meaningfully unique ideas, where we tracked them. This was a fundamental study that Dr. Chris Storman and I did that showed the impact of fear was massive — literally, people would shut down in the face of fear. It's a torture test that shows it. Other people have since shown the same thing, but to see it so strikingly — a group of people in the process of creating ideas, same pattern, same groups, done over many years — it was pretty significant. And if you think about the cumulative effect of that, it gets even worse.


Tripp: [00:38:00] All right, well, let's move to the Brain Brew Whisk(e)y Academy.


Tripp: [00:38:09] This is the Brain Brew Whisk(e)y Academy podcast segment, where we take you behind the scenes so you can see what it takes to build a whisk(e)y distillery business. The Eureka! Ranch team, led by Doug Hall, is creating a craft whisk(e)y company with patented technology, like has never been done before.


Tripp: [00:38:33] New data, new learnings — this fits the theme of data. This is your practical application of using data in the whisk(e)y business, so you can demonstrate how this stuff actually works. So what are you learning here?


Doug: [00:38:50] So last weekend we ran a huge test of whiskeys and whisk(e)y cocktails. We had a huge group of people. They tasted them — I don't know, it was 18 cocktails and 12 or 14 whiskeys.


Doug: [00:39:09] Some people did both, some did one. We did micro testing — small samples, so that they weren't drunk; we took care of that. So we ran the tests, and it's interesting, because going in you have hypotheses about what's going to work and what isn't. We found some cool stuff: some things were confirmed, some were challenged, and some opened questions we don't know the answers to yet. OK, so first: we found that our four core whiskeys actually appeal to different groups of consumers.


Doug: [00:39:43] And so there's a very different profile for the people that like each one — whether it's the Edrington set of Relativity, Noble Oak, and Noble Oak Rye, and the smoke product, or our Riverboat collection. Very different profiles for each of those. So that was cool to learn.


Tripp: [00:40:02] So those four offerings you just mentioned — does that happen to break out by grain: barley, wheat, rye, and so on?


Doug: [00:40:15] Well, there are differences in them, but the big point is this: it's different people.


Doug: [00:40:20] So what happens is the light and new whisk(e)y drinkers like the Relativity product massively more, and the Noble Oak product really appeals much more to those who want richness. A way to think about it — and I'll talk about this in a minute — is red versus white wine. In a simple sense, Relativity appeals more to white wine drinkers; Noble Oak appeals more to red wine drinkers.


Doug: [00:40:58] The rye product appeals more to IPA drinkers — the stronger flavors. It's just fascinating to see the dimensions, and we've got more analysis to do. We're putting together a three-dimensional profile and running t-tests across them. But they are distinct. In fact, the way I can say they appeal to different groups is that we took the appeal scores of the four products and ran what's called a correlation matrix, where you compare them against each other. The appeal of each product doesn't correlate with any other product, so the people who like one are, in general, different from the people who like another. That's what it tells us. They're different, and that means they bring different customers in.


Tripp: [00:41:48] So would it be fair to say that they're kind of separating themselves into groups — like your example of a red wine versus a white wine drinker — and you find different groups from there?


Doug: [00:41:58] Yeah, there are psychographics to it — cocktail trendsetters and things like that — and we're putting together those profiles. But the neat thing is: if Relativity and Noble Oak had correlated — if the person who liked one also liked the other, and the person who didn't like one didn't like the other — then you'd say one of these products isn't needed, because it's the same customer. But they don't.


Doug: [00:42:21] None of the four correlate with each other. They're all different, so these are different markets, and we know it's a good business decision to have four products. You always look at it and ask, "Do I really need all four? Which one can I get rid of?" No — there are four different groups of consumers.
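The check Doug describes — computing pairwise correlations of appeal scores and looking for near-zero off-diagonal values — can be sketched like this. The scores here are randomly generated for illustration, not the actual Brain Brew study data:

```python
import numpy as np

# Hypothetical appeal ratings: rows are tasters, columns are the four
# products. Illustrative only -- not the actual study data.
rng = np.random.default_rng(0)
appeal = rng.integers(1, 10, size=(50, 4)).astype(float)

# Pairwise correlation matrix: rows/columns correspond to the products.
corr = np.corrcoef(appeal, rowvar=False)

# If the off-diagonal correlations are near zero, liking one product tells
# you little about liking another -- i.e., four distinct customer groups.
off_diag = corr[~np.eye(4, dtype=bool)]
print(np.round(corr, 2))
print("max |off-diagonal correlation|:", round(np.abs(off_diag).max(), 2))
```

In practice you would also test whether each off-diagonal correlation is statistically distinguishable from zero before concluding the audiences are distinct.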


Doug: [00:42:36] The second thing we confirmed — and this is from a gargantuan study, so this is real data; don't say it's not true, it's fact — is that only 18 percent of people in America drank whisk(e)y in the last twelve months. Eighteen percent.


Tripp: [00:42:54] Is that any whisk(e)y, whether it's neat or mixed?


Doug: [00:42:58] Any way at all. And by the way: 18 percent drink whisk(e)y, and 20 percent of those drink it neat. So that's 3.6 percent of the American population — let's just be specific there. Get over yourself. Ninety percent have a whisk(e)y cocktail they love. We had a wide group of people, including people who didn't drink whisk(e)y, and what we found is there is a cocktail that works for 90-plus percent. Whisk(e)y puts a depth into cocktails that you don't get from gin or vodka and that kind of stuff.
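The 3.6 percent figure Doug cites is just the product of the two shares:

```python
# Share of U.S. adults who drank whisk(e)y in the last twelve months,
# and the share of those drinkers who take it neat (figures from the episode).
whiskey_drinkers = 0.18
drink_it_neat = 0.20

# Multiplying the two shares gives the fraction of the whole population.
neat_share = whiskey_drinkers * drink_it_neat
print(f"{neat_share:.1%} of the population drinks whisk(e)y neat")  # 3.6%
```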


Tripp: [00:43:34] OK, just for clarification — is that 90 percent of the 18 percent, or 90 percent of everybody? OK. OK.


Doug: [00:43:45] And we were also surprised — we've got to dig deeper into this — by a cocktail called the Sazerac, which I don't think we've done yet.


Tripp: [00:43:53] Yes we have. Yeah we did.


Doug: [00:43:55] OK, well, the Sazerac is a top one. I've got to make a list of all the cocktails we've done.


Tripp: [00:43:59] We need to put a cookbook together.


Doug: [00:44:02] I know — we need to do that.


Doug: [00:44:03] I should start to pull them together. The Sazerac is a New Orleans classic. It's a very boozy cocktail, but it's got a really good balance to it, and where other boozy cocktails dropped off, the Sazerac had a much broader appeal than I expected. It made me think we've got to think deeper — we can't just stereotype and say a boozy cocktail, one that's not like a highball with a lot of liquid in it, isn't going to appeal to non-whisk(e)y drinkers. It is possible to reach them, but you've got to have more balance, and that balance is in the Sazerac. Some people like the licorice note, some people don't — which is a different issue — but there is more potential for cocktails than we understood. That was a surprise. Another surprise: we have this dream to create this Whisk(e)y Wizard that guides people to the right whisk(e)y and cocktail, and we had some people at the Ranch saying this is never going to work.


Doug: [00:45:06] I mean, "It can't work." But we found some things you'd expect — frequency of whisk(e)y drinking, preference for Scotch versus bourbon, very different profiles of the products people drink, things like that. And then we found some things I would not have expected, like the red wine versus white wine split I talked about. Literally, across the four products, you're going from white wine to deep red wine — you know, the big cabernets.


Doug: [00:45:36] I mean, it's almost like I can ask you what wine you like, or what kind of beer you like, and I can tell you which of my four whiskeys you should try. It is spooky. Same thing with the whisk(e)y trendsetter question: we asked people, "Are you asked for ideas and advice on cocktails?" — and sure enough, that points you toward different ones. Do you like hot and spicy food? That's going to point you to certain cocktails. Interestingly, hot and spicy food —
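The Whisk(e)y Wizard idea — mapping a stated wine or beer preference to a product — could start as something as simple as a lookup table. This toy sketch only encodes the pairings Doug mentions in the episode (white wine → Relativity, red wine → Noble Oak, IPA → the rye); everything else, including the function name and fallback, is a hypothetical placeholder:

```python
# Toy rule-based sketch of the "Whisk(e)y Wizard" concept. The real system
# described in the episode is built on survey data and statistics; this only
# illustrates the preference-to-product mapping Doug talks about.
RECOMMENDATIONS = {
    "white wine": "Relativity",
    "red wine": "Noble Oak",
    "ipa": "Noble Oak Rye",
}

def whiskey_wizard(preference: str) -> str:
    """Suggest a whisk(e)y for a stated drink preference."""
    return RECOMMENDATIONS.get(preference.lower(),
                               "try a guided tasting of all four")

print(whiskey_wizard("Red wine"))  # Noble Oak
```

A data-driven version would replace the hand-written table with the psychographic profiles and correlations from the tasting study.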


Doug: [00:46:05] There's a famous cocktail in New Orleans — and again, New Orleans is a great drinking city — Brennan's brandy milk punch. The Brain Brew crew did a twist on it: a bourbon milk punch, where we replace the brandy with bourbon and use half-and-half — which I was going to talk about in a minute. Well, let's talk about it now.


Doug: [00:46:36] Take an ounce and a half of Noble Oak bourbon.


Doug: [00:46:39] Two and a half ounces of half-and-half cream.


Doug: [00:46:43] An ounce of cream of coconut. Shake it.


Doug: [00:46:47] And then put some shaved chocolate on top — just grate some chocolate bar onto the top of it. The classic brandy milk punch uses simple syrup and vanilla; in place of the simple syrup and vanilla, we use cream of coconut. In fact, Lydia Carson of the Ranch invented this version we were doing, and it's amazing.


Doug: [00:47:14] And people who like hot and spicy food love this cocktail. I'm thinking they need the cream to cool down the heat or something. I would never have guessed that — I'd have guessed that people who like hot and spicy would like the smoked whisk(e)y, but I would never have guessed the bourbon milk punch. It was also the number one cocktail with non-whisk(e)y drinkers, and I would never have gone there. They loved it. In fact, during the tasting, we happened to make up an extra batch of it.


Doug: [00:47:49] And as we're going around — we had this big room of people, I don't know, 70, 80, 90 people — we're serving them glasses out at the bar, making this stuff up.


Doug: [00:47:59] And I went by the tables, and at like three of the tables they said, "You got any more of that one?" And I went out: "Guys, they want more." "Oh yeah, we made an extra batch." I bring it in and they literally passed it from table to table. It was a hit. And I would tell you, craft distillers out there: make the Brain Brew bourbon milk punch. Try it tonight, if you do nothing else. I don't care if you hate whisk(e)y.


Doug: [00:48:27] Get yourself a nice craft bourbon — I'm always going to support the craft guys — or get our Noble Oak bourbon, which comes from a craft company in partnership with Edrington. Get the Noble Oak and some half-and-half and cream of coconut, and rock on, man.


Tripp: [00:48:44] I'm going to have to try that one too. Just a note: it was Episode 9 — Newsletter 9 — where the Sazerac recipe was talked about. So if folks want to go back to that, they can go to Episode 9 and listen to the Brain Brew Whisk(e)y Academy segment, or they can listen to the Brain Brew Whisk(e)y Academy podcast and get it from there — one of the two.


Doug: [00:49:11] I'll also mention to folks — I know there are a lot of people in the craft industry, or people thinking about creating their own whisk(e)y — that we will be in Minneapolis, where the Craft Spirits Association is having its annual conference.


Doug: [00:49:28] It's called the American Craft Spirits Association — ACSA. It's the sixth annual conference for distillers, and it's in Minneapolis the 10th, 11th, and 12th of February.


Doug: [00:49:42] And we'll be there — the Brain Brew Crew will be there in Minneapolis. We've got a booth, and we'll be talking to folks. If you'd like to connect, we'd love to have you come by. That's the 10th, 11th, and 12th — the ACSA conference at the Hyatt Regency in Minneapolis.


Tripp: [00:50:04] OK, we'll be sure to put a link to that in the show notes too. Very good. Well, this was interesting, and I'm sure we're going to revisit some of these things on data in future newsletters and episodes of the podcast. We got a little of everything here: the issues associated with data, the fact that it's out there, and then applying the data to the whisk(e)y. So, very good.


Doug: [00:50:45] Excellent. Well, the whole year we're going to be focused on data. That's my focus. OK, let's get smarter. All right. Thanks.


Tripp: [00:51:06] Is it time to use artificial intelligence (AI) in innovation forecasting? We believe the answer is yes! The Truth Teller system shows evidence of picking winning innovations 7x better than executives. Be sure to look at our show notes for this episode to download more about the Truth Teller system, or go to go.drivingeureka.com/truthteller.


Driving Eureka Podcast © 2019