Coming To The Right Conclusions, Mental Models, Disproving Hypotheses and Not Fooling Yourself: Smartcuts Miniseries Pt.6 with Shane Snow
Click a logo to listen on your platform of choice or scroll down to listen here on the site:
I’m extra excited about this one.
So a lot of you know that I have listened to Shane Snow’s audiobook Smartcuts many times, and how much I was looking forward to having him on the show for the first time back in April. Well, the only thing better was when he agreed to come back and do an entire deep-dive miniseries into the stories of those who have built these skills of avoiding unnecessary work and, instead of slowly climbing the “supposed to” ladder, building their own ladders to success faster, as well as how-tos for how the rest of us can do the same thing.
A new episode of the miniseries will be out each Friday for the next 6 weeks.
If you missed the first time he was on the show you can listen to it here.
Thanks for Listening
Jess
P.S. If you like the episode please shoot me an email and let me know what you liked about it: Jess.Larsen@GraystokeMedia.com
Bio:
Shane Snow is an award-winning journalist, explorer, entrepreneur, and author. He speaks globally about innovation and teamwork, has performed comedy on Broadway, and has been in the running for the Pulitzer Prize for investigative journalism.
Snow has helped expose gun traffickers, explored abandoned buildings around the world, eaten only ice cream for weeks in the name of science, and taught hundreds of thousands of people to work better through his books, including the business bestseller Smartcuts.
Snow's writing has appeared in GQ, Fast Company, Wired, The New Yorker, and more. He is also a board member of the media technology company Contently, and the journalism nonprofit The Hatch Institute. Make sure to check out ShaneSnow.com. Follow him on LinkedIn.
Below is the auto-generated transcript of the interview:
[00:00:00] Jess: Welcome to Innovation and Leadership, I'm Jess Larsen. This is part six of our Smartcuts miniseries with Shane Snow, author of the books Smartcuts, Dream Teams, and The Storytelling Edge, many articles in Fast Company and Wired, and all sorts of other accomplishments, including building a multimillion-dollar tech brand.
[00:00:18] So, Shane, what are we going to talk about this week?
[00:00:21] Shane: All right. So today we are talking about coming to solid conclusions, and mental models for making sure that you're not fooling yourself when you are exploring your own ideas and creative hypotheses. So we've gone over for quite some time now how to basically lead up to and come up with innovative ideas, and how to use lateral thinking to think differently, to think of things that haven't been thought of before, to change the game.
[00:00:48] Now we're going to talk about making sure that your creative lateral ideas are any good. This is the next step in the scientific method. After observation, question, and hypothesis, it's experimenting on your hypothesis, or bulletproofing your hypothesis. And really the main idea here is that you want to try to disprove your good ideas rather than just try to prove them.
[00:01:16] So a couple of episodes ago, we talked about the Bacon method. Sir Francis Bacon is really the pioneer of the scientific method, saying that we need to be methodical and scientific when we're trying to make discoveries and come up with new ideas and prove that they're true. What this stage in the scientific method does is build on Francis Bacon's work with the work of Sir Karl Popper, in what I call the Popper method.
[00:01:42] And I think his last name is apropos, because often at this stage you get your balloon popped when you realize that your great ideas actually don't hold up. But that's actually really important. So Sir Francis Bacon basically said that the key to the scientific method is doing experiments to prove that your hypotheses are true, to prove that your great new idea will work.
[00:02:09] Sir Karl Popper said, actually, what you need to do is do everything you can to disprove your hypothesis. And if you can't, then you're onto something. It sounds like a subtle distinction, but it's extremely important. So that's what we're going to talk about. And really, it's a lot easier to fool yourself when you're trying to prove something than it is to disprove something.
[00:02:32] It's a lot easier, actually, to shoot a hole in an idea than to prove an idea. And if you can't end up shooting holes in an idea, then that's a lot more powerful of a solution to a problem, or creative idea, than something that you have just done everything you can to try to prove. So to go back to the analogy that we used before with the Bacon method, the literal bacon method, I said that, you know, bacon tastes great on everything.
[00:03:01] You need to actually observe that and, you know, try it out, put bacon on the sandwich, or put bacon together with chocolate, in order to prove that it tastes great on everything. Well, that's not quite the right way to scientifically go about coming up with breakthrough ideas. When you try to disprove the theory that bacon can taste great on everything, that's where the rubber really meets the road.
[00:03:23] So it turns out that if you put bacon in peach sorbet, it's actually quite disgusting. If you put bacon in a bowl of Froot Loops, it makes the Froot Loops worse, especially when the bacon starts to get soggy. So when you go through that exercise of trying to disprove the theory that bacon goes well on everything,
[00:03:40] it's easy to pop the bubble. But it also helps you to refine the hypothesis, to "bacon goes great on certain types of things," and that's useful information. It's more useful than going out and staking your whole career or business or whatever it is on the hypothesis that bacon tastes great on
[00:03:56] absolutely everything. If you were actually betting money on that, at some point you would be losing money. So the point is, we often use our smarts to debate, to prove that our great ideas are right. And if you've gone through this whole process of using lateral thinking to come up with new approaches to solving problems, you will use all of your intelligence to try to prove yourself
[00:04:22] right. And I would say you'd naturally be smarter to use all that intelligence to try to disprove your ideas instead, and then whatever makes it through that process is what wins. So that's really what we're here to talk about today.
[00:04:37] Jess: I love it. Well, let's jump right into it.
[00:04:42] What's the first thing you want to cover there?
[00:04:43] Shane: Okay. So really, before we get into the methods of disproving or proving a hypothesis, the first thing is this idea of inversion, which this whole thing, disproving versus proving, is an example of. Inversion
[00:05:03] is basically the principle that says you can better understand the path to a successful outcome by flipping whatever it is you're doing on its head. And this gets back to what we talked about a couple of episodes ago with asking the right questions. Instead of asking, how do we encourage innovation at our company (which will lead you to try to find ways that you can prove will encourage innovation,
[00:05:34] and you'll look for confirmation that the things you're putting into place, you know, your processes and your policies, are helping innovation), if you invert that and you say, how could we discourage innovation, and then look for where you're doing that in your company and where you could remove those things, that often ends up being a much better way to solve a problem.
[00:05:56] So that's really what this part of the scientific method is about. It's about this inversion process. Another example, really relevant right now: people are out protesting, most of them peacefully protesting police violence, and some of those protests are turning violent themselves
[00:06:13] in response to the police violence. It's a really tricky situation. It really sucks. It's a systemic problem. So we could ask ourselves, how can we discourage police violence? And we can come up with all sorts of hypotheses, you know, punish police officers more, or whatever it is. Or we could ask, how can we encourage police nonviolence?
[00:06:33] The inversion there leads to a whole different thinking process, and actually encouraging nonviolence might be a better way to disprove some of your hypotheses around how you could get police to be less violent. So that's the idea. But the main idea that inversion leads us to is basically the difference between induction and deduction.
[00:07:00] So if you're familiar with your Sherlock Holmes, Sherlock always talks about "a simple deduction, Watson." You know, he solves these problems, comes up with these theories, and then goes out and proves whether they're true. He figures out who the bad guy probably is, and then he goes out to try and catch them.
[00:07:17] And he always talks about deduction. And I think understanding induction versus deduction is the first key point when trying to figure out if our ideas are good. So, induction is where you take observations, you look for patterns, you make a hypothesis, and then you come to a conclusion. This is usually what we do.
[00:07:38] It sounds kind of like what we've been doing so far with the scientific method. You make observations, you look for patterns, you ask questions, you come up with a theory. But what humans usually do is then make the leap to: this is the way it is. I've gathered these observations, and based on those observations, I believe this is true.
[00:07:58] What deduction is about is this: when you get to the point where you have the hypothesis, you then want to test it, and make observations from the test that confirm or disprove the theory. So it sounds nuanced, but do you see what I'm getting at?
[00:08:14] Jess: Yeah. You know, I think, and correct me if I'm wrong, that it was a big deal when Bacon started promoting this, because he was basically going against Aristotle, and, you know, this was not a lightly taken thing.
[00:08:27] You know, I think for me, the thing that stands out the most there is this idea of, and correct me if I'm getting this wrong, but deduction is essentially: look at it from the top down and then filter down to a specific case. And inductive reasoning is this idea of: look at a bunch of individual cases and work your way up to what that probably means generally.
[00:08:54] Is that close?
[00:08:55] Shane: Exactly. Yeah, that's excellent. And I'm not saying that deduction is better than induction; both of them are useful ways of thinking. Often we don't have all of the information we need in order to make deductions, and we can't just, by process of elimination, narrow down to what the best answer is until we do induction, which is: we make lots of observations.
[00:09:20] We look at lots of cases, we build up. Think about every cop show on TV where, you know, there's a murder and they're trying to solve the case. And they always do this. I have no idea if cops actually do this, but they always have this corkboard with photos of people and string between them, and you can't make sense of what it means, but, you know, they're building the case. And there's a question mark where a name should be. Like I've mentioned before, I've been watching The Wire.
[00:09:48] So it's like, Stringer Bell, what does he look like? Here's the question mark, here's his name, and we know he's connected to so-and-so. That is this inductive process. You're making observations, you're gathering, you're making a case for something. But if you want to put someone in jail, you have to prove that your case is solid.
[00:10:07] And often there are a lot of leaps you have to make, because there are open questions. When the cops are prematurely trying to put someone in jail after they've put together this big case, the thing that you've got to do in order to actually prove who's guilty
[00:10:24] and who's not is deduction. You go and actually arrest people, you have them testify, you ask them questions, and they try to disprove your case that they're guilty. So you're eliminating the suspects, and there's often an experiment aspect to this. You know, we leap from induction to deduction
[00:10:47] when the cops then go out and ask the questions, or they show up at someone's door and the person runs. That's more evidence that something's afoot; it's building more of the case. But then they have to actually break things down and narrow down to: yes, this is the person who did it, because there's no other possible explanation.
[00:11:09] So basically, the missing step for too many of us is this: we get so excited that we've built the case for something, but then we don't double back on the case and try to actually poke holes in it, to make sure that there's no way this case will fail when it gets to court. The thing to avoid is arguing with the force of deduction, which is: you have all the data, you have all the facts, you've narrowed it down,
[00:11:36] there's only one possible answer, or you can get to one possible answer because you have all the facts. Arguing with the force of deduction when what you're actually doing is using induction, talking about probability based on evidence, that's the thing to avoid.
[00:11:53] I'm making basically the fanciest, nerdiest case for: you want to run experiments in order to actually disprove things and be certain that what you're doing is a smart idea. Without those experiments, you're making leaps. That's induction. With those experiments, the idea is to, through process of elimination
[00:12:10] and the process of bulletproofing, make sure that you're right about what you think.
[00:12:16] Jess: You know, I think for me, something about this work we've done together, and these interviews, that has been helpful is, I think growing up, we were told to do stuff the scientific way. Or, you know, it was implied that the scientific method
[00:12:33] was the right way, right? And in some ways it almost became like religion: do the scientific method because you're supposed to. And I think, you know, you and I have talked about the Shingo Institute, with all the Toyota Production System lean stuff, and the way they embraced thinking scientifically, and the way we've talked about it.
[00:12:54] And I almost feel like it would have been more helpful to me as a kid to have it compared, like, which one is more effective? Put the scientific method side by side with other methods, instead of: do the scientific method because you're supposed to. If it could have been framed as, this has a higher probability of getting you the accurate answer you're looking for,
[00:13:20] probably faster, but at least more securely, an answer you can actually believe in. Because if they had described it more as, do the scientific method because it's more effective, because it's likely to give you an answer you can actually rely on for the long term, instead of, you're supposed to, I probably would have embraced it a lot faster, you know?
[00:13:44] Shane: Instead of the dogmatic approach.
[00:13:47] Jess: Which was: you're supposed to. Like, you know, you should disprove it, run an experiment. Well, why? Well, because the scientific method says so. And then we stopped; we didn't go into the depth of it. And as you talk, I think about how many times confirmation bias shows up in my own life.
[00:14:07] In fact, I was commenting on your new LinkedIn newsletter, which, by the way, everyone should go subscribe to. I was commenting on your newsletter today, right, and I'm saying, hey, how do I embrace more intellectual humility? Because I get married to an idea and then I just try to persuade everyone of it
[00:14:24] instead of engaging the curiosity that might get me into something better. Right?
[00:14:29] Shane: Yeah. I really appreciated that comment by the way,
[00:14:33] Jess: Well, this is the point though. The scientific method, as you've been teaching me what I should have figured out in eighth grade or something, is that by bringing that lens to the issue, I'm less likely to get into the tar baby problem of protecting my own hypothesis, or protecting that
[00:14:54] I look smart because I'm the one who brought it up. And now it gives me the ability to feel good about myself, because I was able to disprove it, and I was effective at figuring out that it would have been a mistake if I'd stayed married to that.
[00:15:08] Shane: You know, it's interesting, because I think about when I did the science fair. It must have been fifth grade, and I did this experiment where, I loved these little army guy toys that I had,
[00:15:21] and I wanted to figure out what's the best way to drop them off the roof of the house and have them parachute down and land on their feet, all safe. Basically, which parachute will get them down the slowest, or whatever. And so I had all these different shapes of parachutes I made for my army guys.
[00:15:37] And it was really disappointing, because they all just fell like rocks. They're very heavy compared to, you know, a little paper parachute. And I was really disappointed, and I basically, like, failed my experiment. I don't even remember what I did, but it was like I lost, because all I did was find a bunch of ways not to reduce the drag, or reduce the speed at which an army guy falls off the roof.
[00:16:05] I disproved a lot of parachute designs, and I remember being really disappointed. But the real scientific method would be celebrating that. If, as kids, we were taught that the game is actually to try to figure out what doesn't work, that if I find something that seems like it would work, I should figure out all the ways I could kill it,
[00:16:26] and you get really excited when you can't, but you also celebrate when you do, like, ah, I was able to kill the army guy parachute thing, moving on to the next one. If things were framed that way, then I think a lot of the way that we debate and argue and talk about things would be a lot less defensive, honestly. Because when someone does shoot a hole in your idea, or they bring up something, especially when they do it in an honest way, not in a sort of red herring kind of way, which happens way too often,
[00:16:56] when someone brings up something that's wrong, you could feel a lot better about it. You'd be like, oh wow, we're making progress. The point is to find the holes in the argument. The point is not to prove that you're right. You know, that habit kind of gets instilled from very early on. And I think it leads to, you know, one of the things I want to talk about today: intellectual dishonesty, fooling yourself, making arguments that sound good but actually don't logically hold up,
[00:17:23] and fooling yourself because of the pressure to be right, thinking the point is to be right when the point is the opposite. I think the problem is that this scientific method, you know, of deduction, of disproving things that don't work, takes longer. And there's not always a clear resolution.
[00:17:42] It's a sort of never-ending game. And at a certain point, you do need to go with whatever, you know, works best; you can't spend forever on one experiment. So that's part of the problem. It's not as satisfying, but it's so much more effective and so much less personal than the prove-you're-right game.
[00:18:03] Jess: You know, I totally agree. It is so much less satisfying in the short term. But I look at the mistakes I've made in the investment fund industry, right, or just doing startups. And when I come up with this case for why something should work, and I'm trying to do the prove-why-it-works kind of approach,
[00:18:30] I just leave myself open to so many concerns and objections from much smarter investors, you know, much better businessmen than I was, who were like, well, what about this? What about that? Right? And I'm so busy trying to prove that we're okay and you should go with us, and half of it is lying to myself about stuff that I don't know,
[00:18:49] I-wish-I-knew kind of things, right? And, you know, a bunch of those businesses didn't work out. Right. And now, look, we've taken so long to get this real estate investment trust going that we're about to launch here. And unfortunately, because of the pain of having done that in the past, we are just brutal with ourselves now.
[00:19:13] We are working really hard to come up with every reason why somebody shouldn't invest with us, and where we are legitimately worse than the competition, and things like this. And it's for a couple of reasons. One, so I don't get caught flat-footed and look dumb in front of a potential investor. That's not the right time to be figuring out an answer to this.
[00:19:33] Shane: Right.
[00:19:34] Jess: You dig your well before you're thirsty. But also, it creates an enormous amount of trust when you're willing to own your weaknesses. Like, when you can stare an investor in the face and go, oh yeah, we're worse than the other guys at that, and you don't think less of yourself because of it. You know, people are like, oh, well, why do you think you can do this anyway?
[00:19:58] Oh, well, I'm happy to tell you. You know, and it's the honesty. You know, I really am annoyed with all the trash talk in MMA and boxing, right? Because I grew up doing competitive judo, and it's much more traditional Japanese, and there is respect for your opponent. And, you know, you've got to have your game face on, but you do that internally.
[00:20:19] There's none of this public disrespect, right, which just turns into theater, and saying things that the person saying them doesn't objectively believe. They're just trying to psych themselves up, or just trying to put on a good show, you know?
[00:20:32] Shane: Yeah. There's Oh, go ahead.
[00:20:36] Jess: Well, we were started the show.
[00:20:38] We were talking about the guy who was on the first episode of, of our podcast, chip youth, Kansas city SWAT. And he just talks about how we naturally discount an opponent that we've objectified and how that's inherently unsafe.
[00:20:55] Shane: Wow. Yeah. And so much of business, when I think about it, turns into this adversarial kind of relationship. You're pitching investors, and it's this adversarial relationship.
[00:21:09] And if you haven't done the homework that we're talking about, and you roll into an investor pitch meeting, and, you know, three minutes in an investor says, well, what about this? And there's something to it. You think that they're a jerk, you will trash-talk them forever, it will derail you. Anyway, you will take it personally.
[00:21:30] And, you know, conceivably you're trying to pick a partner, you're trying to enhance your team with these people, and a good teammate will help you to elevate yourself, just like in any sport. I did Japanese jujitsu in high school, so similar thing: your opponent is actually your partner, you know, when you're practicing, and even when you're sparring, it's to help you elevate yourself,
[00:22:00] rather than being the person that you're trying to destroy, which is what the pageantry of, you know, MMA is often about. And we end up having that kind of relationship with our teammates or our boss or our potential investors or our competitors, when, you know, someone who shows you the flaw in your business is doing you a favor, because they're helping you complete the scientific method that you didn't necessarily complete, or didn't think through.
[00:22:26] And so if you treat them like they're doing you a favor, first of all, they're going to be a lot more likely to want to work with you. But also, it's going to be less personal, and then you're not going to devolve into cheap tactics. You know, if you're sparring with someone in competition and they get you personally mad and it becomes personal, you're going to be a lot more likely to do something mean or nefarious, or, right,
[00:22:49] illegal when the judges aren't looking, and then hopefully you'll feel terrible about it later. But we do the same thing with ourselves, with our own businesses, when we start to take things personally because of that adversarial thing. So I think there's a lot wrapped into what you're saying there.
[00:23:07] Jess: It's interesting, right? Because it feels so quick and easy if they would just agree now, if they could just be convinced now. And let's face it, there's plenty of arrogant folks with a lot of money who are using this as an opportunity to let you know you're not alone, you know, and they get some of their feeling of superiority
[00:23:30] by poking holes in your thing, and the basis of that comment is not intellectually honest or helpful. And yet I think about the people I look up to the most, who are willing to listen through the smoke and the arrogance and look for the kernel of truth to benefit from, and just let the rest slide.
[00:23:48] They lean back and go, that's so interesting, and embrace it with curiosity. I mean, I think about jujitsu, and what a huge advantage it is when an opponent tries to push you, and rather than push back, you pull them and use their momentum. Right? This idea of, you know, somebody comes at you hot, ripping apart your investment thesis, your business premise, and you meet them with curiosity, trying to find out more about it.
[00:24:16] I mean, they hardly know what to do with you, because they were kind of looking for a fight.
[00:24:21] Shane: right? Yeah. Did I ever tell you about the story of the guy who tore me apart in the Atlantic? So. I a few years ago, I was at Cannes advertising festival in France, and my cell phone was almost dead.
[00:24:39] And I get this text from my PR person, and it's a link to an Atlantic article where my name is in the URL. So it was atlantic.com/technology/shanesnow something, something, and I wrote back and texted her back. Oh, cool. And she texted me back. No. Okay. And what it was is a, this guy I'd written this blog post.
[00:25:00] that was a thought experiment that wasn't very well thought through. I think it had some really good points, I still do, and it was meant to be a thought starter. But it also had some stuff that I'm now ashamed to think about, stuff that was really naive and not very sensitive, I think.
[00:25:21] And what this guy, an MIT professor who's a columnist for The Atlantic, had done is write this whole article about how the technology industry was doing terrible things by trying to solve problems it didn't understand. And he used me and my blog post as the core example in this, like, 3,000-word article.
[00:25:41] And it was basically, Shane Snow sucks, because look at him, and look at the kinds of things he's saying. And I was really upset, and it was a good thing that my cell phone was dying, because I didn't have time to email him angrily or respond angrily on Twitter. So it took me hours before I could even get back and read it, and by that time I had cooled down.
[00:26:02] And so eventually I read the whole thing, and I wanted to write him an email making a case for why he was wrong. But I realized that he was right about some key things. I think he was mean-spirited, and he actually admitted that he was mean-spirited when we talked later. But he was right about some things.
[00:26:21] I had done basically what I'm describing. I was trying to make the case for something that was very easy to shoot holes in, and actually, if people were to do what I was suggesting, it would be pretty bad. It was about reforming the prison system,
[00:26:37] an industry that I have very little experience with, and I was talking about how to use technology to reform it. Anyway, so I ended up writing this email acknowledging that, you know, he was right about a bunch of things, and then telling him about a couple of things that he did get wrong: he referenced an old study when there was a newer study. And then I just told him, like, hey, I meant well.
[00:26:57] And I think that you meant well, too. So I would love to have, like, a deeper conversation about this. And I was kind of hoping that he would retract it, which he didn't, and that was maybe me being manipulative. But his response was so awesome. He wrote back and said, no one ever does this. No one ever writes me back after I tear them apart.
[00:27:16] Thank you for being thoughtful. I will totally correct the thing about the outdated study. And I'm really sorry that I was a little bit ad hominem, and I may correct that. But, you know, would you like to come to my class sometime? We can actually talk in front of students about this very issue.
[00:27:33] I think that would be really valuable for them. And I was blown away, because that was not the response I thought I would get, given the way he had written this article. And so we had a really nice back and forth. I ended up revising my post and acknowledging a lot of the things that I'd gotten wrong, and it ended up being a very positive experience.
[00:27:55] So it's sort of a long story, but the cool part of this story is, a few weeks later I was working on a post that I knew was going to be controversial. And so I had some of my usual friends read it beforehand and give me their notes. And then I thought about, who is it that's going to hate this post the most?
[00:28:13] Oh, it's that guy. So I emailed him and I said, hey, would you read a preview of this post I'm going to do, and tell me everything that you can find wrong with it, everything you disagree with? Poke holes in it. And he wrote back, with pleasure. Like, a two-word email. And then, you know, a couple of days later, he wrote back this huge, enormous, thoughtful analysis of my post.
[00:28:32] And I used it to make the post better. And then the coolest thing was, about a week later, he emailed me with the same ask: I'm about to publish this thing, would you critique it? I know we don't see things the same. And so we kind of turned each other from an enemy into an ally.
[00:28:50] And I love having someone like that. Actually now I I've tried to do that with other people too, having people who I know that I can invite to try to help me disprove my own thinking and by inviting that it makes it not so bad. it hurts less. It's now part of the game and it makes it not this adversarial relationship.
[00:29:08] It's about, like, both of us now; our goal is to get to the truth and to understand things better than before. So it gets at what we're talking about. That's, I think, one of the best lessons that I've personally been through in the last few years that applies directly to this, and me being kind of a piece of garbage about it at first.
[00:29:26] And then luckily this guy was a nice guy and wanted to be collaborative rather than just, you know, a trash talker.
[00:29:35] Jess: You know, it reminds me of that Abraham Lincoln quote, which I'm sure I'll misquote, but he says something like, do I not destroy my enemies when I make them my friends?
[00:29:46] Shane: I love that.
[00:29:47] Oh, wow. Yeah. I really like that. There's a Martin Luther King quote that I came across. I've been reading a lot of his stuff lately, just trying to, you know, as I think about what's going on in the world right now, he has a quote that's similar. He says the only way to turn an enemy into a friend is through love.
[00:30:04] And I really liked that too. Both of them are, you know, different angles on this idea that we can turn anyone into our teammates if we think of it that way.
[00:30:20] Jess: You know, I watched a movie this month that didn't get much love from the critics, but I really enjoyed it. It was Forest Whitaker playing Desmond Tutu, and him working with one of the guys who'd been on the killing squads.
[00:30:36] Played by Eric Bana. And it's rough, it's a rough movie. But by the end of it, there is this massive transformation. It's based on a true story, and it's not like Pollyanna, but it is just like, man, it gives you optimism for humanity, that no one is permanently broken, no one is permanently racist, no one is permanently anything.
[00:31:02] And yet it's that love and patience and conquering of self on the part of Desmond Tutu that just didn't give him anything to fight against, even though he was trying to provoke him so badly over and over and over. And he ends up saving the life of this Black kid in prison who one of the black gangs is going to kill.
[00:31:19] And the guy ends up... I won't say it. Anyways, it's a really great, really great movie.
[00:31:27] Shane: What's it called?
[00:31:31] Jess: Of course, I can't remember. You talk about the next thing and I'll look up the movie.
[00:31:34] Shane: Okay. But, you know, people like that, I mean, Desmond Tutu is a really good example. The most transformative types of leaders in history tend to be the ones that know that they need to change too.
[00:31:49] Right. And that are willing to admit that they're wrong. You know, this subject matter makes me think of Nelson Mandela. He changed so much during prison, and he admitted that, and he was willing to do that. And that's, in part, you know, if Nelson Mandela had gone with his first hypothesis on the way to correct the injustice of apartheid in his country, he probably would have died in some sort of a violent conflict. But he did change his hypothesis, after a lot of work and a lot of anguish and humility,
[00:32:23] and was able to, you know, heal a really, really toxic situation, and get a lot of people who were having a hard time, you know, with the idea of forgiveness to actually, truly exercise it. And, you know, I think a lot of this stuff that we're talking about boils down to intellectual humility. If there's one meta-skill that can make you better at this stuff, it's that. It's so much easier to let go of your first
[00:32:49] hypothesis, or to take the fact that you're not right about everything, or to adapt, if you see intellectual humility as a strength rather than as a weakness. Like, oh, I have to change; oh, I used to be wrong, and that's okay. Recognizing that as a strength, where, you know, too many of our models of leaders think that they have to be right and they have to be stoic and they have to be firm in order to be respected as a leader.
[00:33:16] But that's not how it works. You know, once again, you don't change the game by playing the game harder or, you know, being tougher in the same game. You change the game by changing the game.
[00:33:26] Jess: So the movie is called The Forgiven. I highly recommend it. I think it's on Amazon Prime still. But, you know, I think about this process, and maybe one bit of advice I would give is, I used to get in so many arguments with my brother. We've, you know, run investment funds together,
[00:33:42] we've run our charity for a decade together, all these things. Right. And both he and my wife are incredible at spotting the holes in a situation. You just start telling them something, and they don't have to work at it, they already know the problems. Right. And I was so frustrated so often. And what we ended up doing is, with my other partner John, he and I hash things out endlessly and we explore together,
[00:34:06] like we go wide together, and then once we have beat it up enough, then we bring it to Nick. And, you know, he's like my best friend in the world, but, you know what I mean, he can kill things too early when I'm not ready for that yet. And he is also annoyed to death when I'm throwing out 50 million ideas.
[00:34:26] He just wants to know what we're doing next. Right. And so I've stopped annoying him, he's stopped annoying me, by sequencing it: John and I get to a place where we feel like it's somewhat defensible and we're ready for that, you know, and then we go through it with Nick. And then we start going to friends, and then we start going to, like, easy investors who aren't going to rip us to shreds and never talk to us again.
[00:34:50] They're willing to give us a second chance, you know, and try to bring some curiosity. And then eventually, through all those iterations, we can work up to, like, the do-or-die, this is your one and only meeting you'll ever get with this guy if it doesn't go well.
[00:35:01] Shane: Right. And you're ready for that. Yeah. So anyone who's paying close attention to this series will recognize that, you know, in the last episode we talked about expanding the pool of ideas and then whittling the pool of ideas.
[00:35:14] We're talking about hypotheses, and sequencing that, you know, really goes hand in hand with the testing-of-the-hypothesis thing. Pulling in the right people who are going to help you do a better job at, you know, expanding that pool of hypotheses is very much the induction thing, making the case, but then preventing yourself from making the leap to,
[00:35:37] you know, "because we built the case, this is now true in fact," and actually going instead to the step of deduction, of let's chop this down, and let's include your business partner or your wife or whoever the right people are. It's perfect. And what you're saying also leads to the next main idea of this stage of, you know, the lateral thinking process, which is experimentation. I mean, this is where a lot of people don't understand, like, the lean startup thing. You want to do rapid experiments.
[00:36:08] You want to get feedback really fast. In Smartcuts, in the third chapter, I dig into this idea that, you know, doing lots of experiments and failing fast and failing often is really important, but that's the cliché. The reality is, there are some things that are catastrophic to fail at, and you don't want the experiment to be at that catastrophic-stakes level.
[00:36:34] So by the time you have the do-or-die shot to pitch an investor, you want to make sure that, you know, the probability is really good that you're not going to explode. I mean, the analogy I like to use is that failing fast and failing often is fine until you're landing on an
[00:36:57] aircraft carrier, in which case you cannot fail. At that point, the experimentation that needs to be done to prove that you can land on the aircraft carrier is the flight simulator. So, you know, a lot of this stage of disproving our hypothesis is about building little flight simulators for ourselves so that we can figure out what not to do, so that we can land the plane when it comes to it. So, you know, what you're saying is the perfect segue to this idea that you want to find the fastest, cheapest, lowest-stakes ways to disprove the things that aren't going to work,
[00:37:24] before you move on to the higher-stakes things that require, you know, more intense experiments. So if you're doing a science experiment, you want to eliminate the hypotheses that are just easy to eliminate first, before you go out on a limb and publish in a paper. Or, you know, you put the parachute on the toy army guy instead of putting it on yourself and jumping off the roof of the house.
[00:37:46] You want to do whatever you can to cheaply, quickly, and at low stakes run experiments and get rapid feedback, so that you can eventually narrow down to the thing that you're willing to put a bet on. So this is where, you know, there's a lot that's out there already on lean startup and, you know, rapid prototyping. You know, you're making a website, so you draw it on a piece of paper and you test people on the piece of paper before you actually code up the website.
[00:38:14] There's a lot of stuff out there already, and I would say I'm not the expert on those specifics. But I do think it would be interesting for us to talk a little bit about just the idea of thought experiments. So before you even make a prototype, how can you eliminate, or how can you poke holes in, ideas, or use this sort of debate process to cull away, and, you know, deduce away, some of the worst ideas, hypotheses, that you have?
[00:38:42] So, you know, thought experiments are one of those areas where, if you Google "how do you do a thought experiment," you're not going to find much, because it's, by definition, sort of a custom process. But I'm curious, when you think about thought experiments, or when you've sort of encountered the idea of thought experiments, like, what comes to mind?
[00:39:06] Like how do you think about that when it comes up?
[00:39:09] Jess: Well, and this is, you know, I have to give credit to you, because I feel like it's really been spurred on by my study of your work, you know, as you've talked about Einstein's thought experiments while he's sitting there as a patent officer, you know, and some of the things that he accomplished, without much fanfare, by just working through it mentally, but rigorously. You know, it's funny,
[00:39:32] we did one this week. Like, we've been getting ready, you know, we've been working on this investment fund for a year and a half, right, and we've been actively pushing on it hard in the last few months. And we built this, like, big frequently-asked-questions list, to try and answer investor questions before they even start, kind of a thing.
[00:39:50] Right. And this week we came up with way better questions, because we just reversed the roles. And I was like, man, this can get better, because I could tell it was contaminated with sales; our answers weren't really going to build trust. And we would try not to, but it's just so tempting to flip into making the case, you know?
[00:40:11] And so we did a thing this week where we said, okay, let's say that this is as successful as we hope it is. We've got all this money, we don't want to work hard anymore, and some other young bucks come to us saying they want startup cash for their general partner to start a fund.
[00:40:33] What are we going to ask those guys? Man, did we come up with different questions, and really quick. Like, in the space of 10, 12 minutes, we had come up with, like, significantly more questions than where we thought we'd exhausted it after six or eight weeks of coming up with questions.
[00:40:50] Shane: That's awesome. Yeah. I mean, it gets at exactly what my sort of theory on thought experiments is, which is that, you know, in general, a thought experiment is conducting an experiment without doing anything, right?
[00:41:02] It's thinking through all of the possibilities and, you know, sort of doing a process of elimination on the possibilities until you have to do a real experiment. But really, what that consists of is asking a lot of questions. And, you know, if you listened to the episode where we talked about asking the right questions and second-order effects, you know, a lot of that episode really was about the big picture, making sure that you're going after the right thing by thinking through the ripple effects.
[00:41:29] But the ability to come up with questions to attack, you know, a hypothetical is really what a thought experiment is about. I love your example, it's so good. There's a lot of psychology research that points to basically the fact that we're really good at coming up with hypothetical answers for things.
[00:41:50] And we're actually really good at coming up with more than we think we are, if we're pressed to. So, I'm going to forget who did this experiment, but one famous psychology experiment is, the researchers took a bunch of groups of college students, as usual, and they had them write down arguments for
[00:42:09] divisive issues like abortion rights and, you know, gun ownership and things like that. And then they had them write arguments against those. And they asked them, you know, which side do you fall on? And what they found is that most people came up with more arguments for the side that they fell on than against the side that they fell on.
[00:42:33] But then what they did is they told everyone, you have another 30 minutes to come up with ideas that go against what you think, you know, what you prefer. And everyone was able to come up with a longer list than the list that confirmed what they wanted. And I'm sure they could have come up with more ideas, given enough time,
[00:42:53] on the other side of the list. The point is that, if we force ourselves to, we're really good at this. The human imagination is really good, and often our biases and, you know, what we want to be true kind of get in the way of us doing what you guys did. When you flipped the scenario, you forced yourself to think from someone else's point of view, and you were able to then do so.
[00:43:18] So that's basically what thought experiments are: you have an observation, you have a question, you have a hypothesis, and then you ask yourself, how would we disprove this hypothesis? What are all the questions we can ask to get at this? And some people swear by, you know, diagramming things out.
[00:43:35] I'm a pretty visual thinker, so I'll often, you know, kind of do these "well, if this happens, then this would happen" sort of things. But with anything, we actually have a lot of capacity to play with things before we even have to go run a real experiment. So one that I was thinking of today to use as an example, one that's new to me because I encountered this question on the internet recently, can sort of illustrate this.
[00:44:01] Say you make the observation, you look in your freezer and you notice that the ice cubes in the ice cube tray are white in the middle, which my ice cubes are, I checked the freezer, they are. This is not a high-stakes problem, but let's say that you make that observation. And then you ask the question, why are ice cubes white in the middle?
[00:44:22] So you can come up with all sorts of hypotheses, but, you know, the hypothesis that I came up with initially, after thinking about it for a while, is that, well, water is actually not that see-through when it's frozen, and, you know, in the middle it's the most frozen, or it's the thickest in the middle.
[00:44:38] And so that's where it becomes most apparent that frozen water is not so see-through. So at that point I have, you know, the scientific method. We're at the point of, well, how would I disprove that hypothesis? First of all, I could come up with other hypotheses. But if we wanted to disprove that one, how would we do that?
[00:44:56] Well, we'd break it apart by asking questions. What's an ice cube made of? Made of water. But what kind of water is it? Pure water? You know, are the ice cubes in my freezer tap water, or are they distilled water, are they purified water? What else is in water, and is all that stuff clear? Do ice cubes have only liquid in them?
[00:45:16] Do they have gases in them too? How does an ice cube freeze would be another question. Does it freeze all the way through all at once, or does it actually freeze from the outside in? Asking these questions helps me to sort of tease it apart, to then say, well, you know, the hypothesis that an ice cube is
[00:45:36] white in the middle because that's where it's the most dense, the most frozen, actually doesn't hold up, because I know that water freezes from the outside in. If you ever see a frozen pond, underneath the surface it's still liquid water. It's the surface that freezes first; it takes a long time to freeze all the way through.
[00:45:55] So the middle of the ice cube is actually the least frozen part of the ice cube. So even just playing with that, it's sort of like a little silly thought experiment. But by asking those questions, I realized that my hypothesis, that, you know, the ice cube is white in the middle because, when it's frozen,
[00:46:13] the thickest part, you know, is white, doesn't quite make sense. But, you know, in asking those questions, I come up with perhaps a better hypothesis, which is that there's other stuff in water. And as water freezes, maybe the other stuff, like the minerals and gases or whatever's in water,
[00:46:32] there's something else in water, I know this because you can purify something out of water, so the stuff that's in the water, maybe as the water freezes on the outside, that stuff gets pushed to the middle. So even just going through that, I've basically figured out, like, a really good, not answer to this question, but actually a much better hypothesis, such that at a certain point I'm going to have to do an experiment.
[00:46:56] I have to chop an ice cube open to see if, you know, it's just the thickest part of the ice cube that's white. If you chop an ice cube in half and you notice that it's white along the chopped edge, then okay, it's not, you know, how far you have to see through it that's the answer.
[00:47:12] And so then, you know, that's a new observation, and we can go through this whole process again. And I looked it up, and it's actually true that the substances in the water are what make it tend to be white in the middle. But another thing that we could do is we could say, well, let's get purified water versus tap water and make ice cubes out of them.
[00:47:30] Or we could even not have to do the whole experiment. I could say, have I ever seen ice cubes that are clear all the way through? Yes, I have. I've seen it in a bar, like those fancy round ice cubes that they put in an old fashioned. So those aren't white in the middle, so that must be it, right?
[00:47:45] It's not about the thickness of the ice, it's something else. And yeah, I looked it up, and it turns out it's about the particulates inside of the water that we get out of our tap that make that happen.
[00:47:56] Jess: So, I love how thoroughly you've gotten into that. And I think for me, sometimes when I hear people as smart as you go into all this, I then struggle with, but how does that apply to my business?
[00:48:10] Or how do I translate that? Can you think of anything in the history of Contently, for instance, as you were growing your tech company, that you ended up running through this process? Is there anything like that, or some other example, that occurs to you?
[00:48:25] Shane: Yeah, so there's a couple of times when we had, you know, some things that we were trying to do.
[00:48:37] I mean, I can think of two that I thought of earlier today, actually, about this subject, one that worked out well and one that worked out not so well, where basically we made a leap of induction in one case rather than thinking through everything. So the scenario is, Contently was trying to basically improve its cash position.
[00:49:03] And, you know, this is a year and a half ago, trying to improve our cash position. And I was on the outside at this time, but I heard about all of the fallout, and then was loosely part of it, you know, I was on the board, not in charge of their thinking processes as they were going through this. But basically, part of the model is, businesses pay for software.
[00:49:23] And part of the model is, if we get freelancers work, then we make the fee that a literary agent would make, which is 15%. And in the history of the company, we'd toyed with increasing that percentage to squeeze a little money out of the equation, and we'd always decided against it, for various reasons.
[00:49:42] And the thing that was proposed was that, instead of increasing the percentage from 15 to 20, we could add a fee for when freelancers cash their money out of Contently. So you made a thousand dollars; if you want to deposit it in your PayPal, we charge you a fee to do that.
[00:50:01] And a lot of people, I can tell you, did not think that this was a good idea. They thought that, you know, our freelancers would be mad that they were being double-dipped. But what we should have asked is, why not? What could go wrong
[00:50:17] with this scenario? Instead, basically, what was decided is, let's test it and see what happens. Let's go to the real experiment. Let's announce this change, and basically, if people freak out, then we'll readdress it. But that turned out to be a really stupid idea, because people did freak out, and it was like the biggest PR disaster in the company's history,
[00:50:39] because basically what it amounted to was, we were holding freelance writers' and editors' and photographers' money for ransom. It's like if the bank says you can't come to the bank and take your money out without paying us a fee. It's not the same as an ATM, it's not a convenience thing. It's like, you literally can't get your money without paying us money.
[00:50:57] So people really flipped out about that. Articles were written about how Contently used to be friendly for freelancers and now it was bad. And we ended up reversing that fee and making apologies, and it went away and it was fine. But it really could have been avoided if we'd gone through a little bit more of a thought experiment around this.
[00:51:18] If we had asked, or, I don't know how to say this without throwing anyone under the bus, if certain leaders had been willing to hear people when they said this isn't gonna work, if they would have been willing to ask, why not, why won't it work, and actually think through the scenarios, you know, the answer would have come out. So if you say, my hypothesis is that adding this cash-out fee
[00:51:43] will help us improve our cash position and it will work out, and people say it won't work, and then you say, well, why won't it work, what could go wrong, and then you model out the potential scenarios, very quickly it would come up that our top-earning users happen to have very big followings.
[00:52:05] The top users of Contently are influential writers and journalists whom a lot of people follow, who write for big magazines and who a lot of people pay attention to. They will complain to large audiences about this, or it's quite likely that that happens. And if that happens, the second-order effects of this may outweigh the amount of money we make from this cash-out thing.
[00:52:28] So "what could go wrong" was actually pretty easy to sort of think through there. And then there's a bit of a probability thing, because you can't know for sure, you can't deduce without getting the data. But what we did is we skipped right to getting the data, and it ended up being quite the disaster.
[00:52:44] On the flip side, you know, there was a time, not so long ago, a couple of years ago, when we realized that a lot of our clients were having a hard time figuring out what to do with content. So they used Contently for content marketing, they used the software, they used the freelancers, but a lot of clients weren't sure what the right strategy was for this, because it's kind of a new industry.
[00:53:06] And so, you know, we came up with the hypothesis that adding strategy services, to actually come up with plans for our clients, would be good for our business. Basically, help them stay around longer, have them spend more money with us, help them make their customers happier. And so the question that we actually debated for a long time was, why wouldn't that work?
[00:53:30] And there was some healthy debate. You know, a lot of the people, I could tell, said that's not gonna work. And we said, why not? And ultimately the answer to that, like the most common answer, from our investors especially, was: it won't work because it will make us an agency. And when we finally looked at that, we said, well, what does that mean?
[00:53:50] First of all, is that true? Is a software company, you know... like, it's a label. You're an agency because you have software and agency services? Fine. But whether or not that's true, the real thing is, is that the same as this not working? Is the label being put on us the same as it not providing value for our customers, not helping our customers stick around longer?
[00:54:14] And so, you know, we went through a lot of the thinking behind this plan to add strategy services well before we added it. And then we added it in a small way, with a few customers, a limited experiment to get feedback, with customers that we knew were not going to freak out and cancel
[00:54:32] if it didn't go very well. You know, we didn't charge a lot of money, and then we rolled it out slowly. So that by the time it's a big rollout and we can take on these big seven-figure strategy deals, we know that we're going to land the airplane safely on the aircraft carrier. So, you know, it's not the cleanest set of examples, but it speaks to the two ways that this can go.
[00:54:52] If you fail to think through the thought experiment and you skip straight to the real experiment, that can be pretty bad for your business. You can also spend way too much time on the thought experiment before you get to doing real experiments. But it was that thought experiment of saying, well, why not,
[00:55:07] and why, over and over and over again, and asking all of these questions, that led us to the conclusion that it's not going to be such a bad thing if people think that we have agency services. And if our investors are pissed that there's an agency arm to this company, but it makes the customers happier and makes more money, then so what? This actually comes up a lot when you go to implement lateral thinking: often the thing that we end up being afraid of is what people will call us or what people will think.
[00:55:34] And, you know, the example of Genghis Khan that I used a time or two in, you know, this series is, Genghis Khan was very innovative in the way that he developed strategies for taking over the world. And one of the things that he did was, basically, there was a certain way you fought, and his hypothesis was that that was stupid, basically.
[00:55:57] And he came up with these plans to basically attack a bigger army and then retreat, and let the army chase you and break ranks and fall into your trap. And everyone that, you know, was with him, all his advisors, said it's not going to work, that's a bad hypothesis. And he said, well, why not?
[00:56:16] What's going to happen if we do that? And they said, we'll be called cowards. And he said, well, is being called cowards the same thing as not winning? Who cares if they call us cowards? Like, let's model this out. We go against, you know, these guys with the huge fortress and this enormous army.
[00:56:35] And then we start running away and make them think that they're beating us. So they open the gates, they run out, they break ranks, they fall in the ditch with the sharp sticks that we set up, covered in grass, and they all die. And then we go and we walk through the open gates that they left open, and we win.
[00:56:50] And there's no one left to call us cowards. But even if they do call us cowards, who cares? So it's sort of this war strategy of caring about the right things. Like, when you model out what could happen in this thought experiment, you can realize that some of the things that you're worried about don't actually matter that much.
[00:57:07] All right, long answer to your question there, but I hope that makes it a little bit more concrete.
[00:57:15] Jess: No, it does. You know, I'm just thinking here of all the things that we've covered today already. How would you kind of sum it up? How would you encapsulate that?
[00:57:28] Shane: The summary would be that, before... I mean, I really think, from a business standpoint, before you go and implement your ideas, because you're excited about them, you've come up with something using lateral thinking, it's fresh, it's innovative, before you go and implement it, you want to make very sure that it's going to work.
[00:57:48] And you do that by taking the other side. Whether we're calling it deduction or we're calling it inversion, you try to poke holes in your argument or in your hypothesis before you go and bring it to market, or before you even go and run your lean startup thing. That's how I'd summarize it: you need to make part of the game be, how do we break this thing, so that you don't go out with something that's going to break.
[00:58:16] Jess: Well, I've got a few questions, and it's okay if you don't have answers, because I'm putting you on the spot. But is there anybody that comes to mind right off the bat that you see as being good at this, or someone you look up to, or someone you think models it well?
[00:58:29] Shane: So
[00:58:30] Jess: or an organization.
[00:58:31] Shane: Yeah. I'm trying to think of a really clear example. I'm hesitating
[00:58:40] because the ones that come to mind are, like, the cliché answers. So I do think that Elon Musk is really good at this sort of thing. I think he's good at generally using the scientific method as part of his thinking process. Actually, I just read an interview with him where he talked about how he's made a lot of mistakes by running his mouth on Twitter, you know, he's gotten himself in trouble on Twitter.
[00:59:07] He's also one of those leaders that really understands the power of speaking directly to people, you know, getting your own message out. He's doing his own content rather than relying only on the press and their interpretation of what he's up to. You know, a lot of people realize that that's very effective.
[00:59:23] But it also can, you know, get you in trouble. And in this recent interview I read, he talks about how he's made mistakes and adapted his strategy accordingly, that he recognizes the things that he's done wrong, which gets back to this intellectual humility thing.
[00:59:41] People don't normally think of Elon Musk as a humble guy, I think in part because he's incredibly smart and knows he's smart. But he is good at changing course when it's the smart thing to do. He is really devoted to pursuing the right path, even if it means doubling back, or being seen changing his mind, or running out of money and having to raise more money.
[01:00:06] He is very... I think, in most cases, his personal ego is left out of the decision-making process. I'm not saying he doesn't have a big ego. He likes to be in front of the cameras, he likes to be celebrated for what he does, he likes to be recognized as smart. But the decisions he makes aren't about making himself look good.
[01:00:28] The decisions he makes are about getting to the right answer. And so he is, I think, really good, especially because it starts with first-principles thinking, at modeling out the thought experiments, you know, asking the right questions, coming up with different ideas. But before he raises all the money, before he starts building the rockets, he spends a lot of time doing his homework and thinking through what could go wrong.
[01:00:54] And part of why he's on my mind, and why I'm hesitant to use him, is because this last Saturday, you know, SpaceX just put two guys in space, up to the space station, the first private company to build a non-government rocket to do that. So it's awesome. And way cheaper than NASA has ever done all of that.
[01:01:15] And, you know, when you build a billion-dollar rocket, you want to make sure that it's going to work. And so he's in the kinds of industries, you know, industrial industries, that really require you to do a lot of thinking upfront before you do experiments. But even then, a few SpaceX rockets have blown up on the launchpad, thank God before anyone has been in them.
[01:01:38] You know, it's taken them 15 years, I think, to finally put a human being in one of the rockets, because they're very careful with their experiments, but they do this escalating series of experiments. And if you read his biography, or if you've seen any of the interviews about his process for starting SpaceX in particular, he did a lot of homework on physics and on supply chains.
[01:02:01] And he modeled out a lot of different ways that this could go wrong before he decided that he was going to invest his life into it. And so I think some of the most fascinating stuff that he's doing now, that is purely in the realm of thought experiments, is his Neuralink stuff.
[01:02:18] I don't know if you've heard much about this, but, you know, microchipping our brains or uploading our brains or whatever it is that he's up to there. This idea of technology and the human brain having some sort of a merger or an interface is the stuff of science fiction, but he's paving a lot of really interesting paths for those technologies through what is actually a very public
[01:02:47] discussion and thought experiment about a lot of that stuff. I don't know enough about it to really dig into the nuances, but I've read enough to be dangerous. And part of what I think he does has some subtlety to it, something a lot of other entrepreneurs do slightly differently than him, and slightly wrong. As entrepreneurs, we're often given the advice to paint the vision and hype
[01:03:14] the dream of where we could go, and then people will invest, they'll support us, and that story will get us the support we need. I've written about this a lot. But what a lot of entrepreneurs do is paint that vision without actually having a very good, well-thought-out plan on paper for how to get there and everything that could go wrong.
[01:03:38] So I was talking to someone, I think literally yesterday. Tell me if this wasn't you, I could be getting this wrong. I was talking to someone who said that there's actually a very thin line between Elizabeth Holmes and Elon Musk. That wasn't you, was it? Okay, yeah, no, it was a few days ago when we talked. I forget who it was that said this, and I think it's actually a really interesting thing to think about.
[01:04:02] So Elizabeth Holmes is the one who had Theranos, the blood testing startup that promised that with a prick of a finger you could do all of this great blood testing work that normally requires a larger sample than a blood drop. And, you know, she built this house of cards, basically, that ended up not having real technology that could work, and ended up defrauding a lot of people.
[01:04:23] And I believe she's in prison or on her way to prison. The court case has taken a long time, but there are documentaries now about how this was the great fraud of this visionary entrepreneur. And Elon Musk has painted this picture of, you know, the entire world running on solar energy, and getting human beings on Mars to make the human race multiplanetary, and all of these really, really huge visions of what his different companies and enterprises will do.
[01:04:50] And he's used that vision and those stories to get people on board, to invest in him, to pay attention to him, to work for him, to deal with what sounds like a very intense work environment. But each of those plans of his has been the product of really intense scrutiny and really intense thought experiments before money has been spent.
[01:05:13] So for the plan to get to Mars, he's done this thing that we're talking about today around inversion, where he has backed up from "we have now colonized Mars" through all of the steps. You know, the delta between "we haven't put any human into space from the United States since 2011" and
[01:05:37] "we have a colony on Mars where people can actually survive" is a huge delta. And he built a strategy in reverse, from "we're on Mars" back to now, all the steps that it would take, based on first principles, based on all of this intense thinking for how we're going to get there, this sort of ladder.
[01:05:55] And really, this is what we've been talking about: building your own different ladder, your own path to your goals and your dreams. That's what smartcuts and lateral thinking are all about. Elon Musk has charted all of that out. And as new things arise, you know, you have to adapt. He recharts what needs to be done instead of sticking to the plan.
[01:06:14] That's not going to work, so he pivots and he adapts, cause it's this ongoing process of the scientific method. But with Theranos, there was only the dream. And I believe that Elizabeth Holmes really did believe that she and the company could get to this amazing technology, but there wasn't anything real for getting there.
[01:06:34] There wasn't even, you know, the proper thinking that was needed to get there. And so then it became this Ponzi scheme of raising investor money instead of actually getting to a real solution. To be fair to Elizabeth Holmes, which I think is a sentence few people have said, she has been subject to a lot of bias, I think, because she's a woman, because people didn't want to give her the benefit of the doubt. But she also defrauded a lot of people and didn't do the hard thinking that was required to bridge that gap between the big vision and, you know, the observation that there's something that needs to be done here.
[01:07:13] You know, the question of how do we get there. So this is where, as I ramble to answer your question, I think this process is not as linear and not as straightforward as it may sound when you say the scientific method is these steps, because they feed into each other. You get to the disprove-the-hypothesis stage, and you need questions.
[01:07:41] You need observations in order to try to disprove the hypothesis and feed that. It's that fractal tree that I talked about. So it is a complex process, and I do think that's why it's hard and why it's worth it: because too many of us decide that that complex process is too hard to think through. You know, even subconsciously we decide that.
[01:08:04] And so then we build the case for what we want to be true, and we go with that, and we hope it's true. We take the risk, rather than de-risking by poking holes and going through this complicated process. So it all swirls together. But I do think that, as cliché as it sounds, there's no better example of someone who does this well than Elon Musk.
[01:08:28] Jess: I love it. We'll try to bring it back to the personal application side. So I'll use myself again. We want to try to do something completely different that hasn't been done in our space before, right? And we want to try and go big with it. You know, there's lots of other multibillion-dollar REITs, and we'd like to be one as well,
[01:08:52] but not do it the competitive way like everyone else. You know, you look at something like mutual funds and '40 Act kind of investment funds: it's like 3% of the funds out there have 80% of the assets. That's a pretty big bar to be like, we're better than 97% of all the firms out there.
[01:09:13] Shane: Right.
[01:09:15] Jess: and it's just not a high-probability process, right? And so we looked at, okay, well, what if we follow Jerry Garcia, and instead of, you know, trying to be the best, we try to be the only? He wasn't trying to have the Grateful Dead make money off record sales; it was off concerts and all this stuff, back 30 years ago, when that wasn't the way it was done.
[01:09:34] Right. So we're looking at this idea, this premise: hey, if we build our own media company, like a mini Bloomberg or mini Red Bull or whatever, we can give ourselves free advertising. And we intentionally don't make it just a transactional sale of our investment, right? We're going to create training programs
[01:09:57] we probably could charge tens of thousands of dollars for, like we have in our previous consulting life, and we're going to give them away for free to entrepreneurs, trying to help them make enough money so they can afford to buy passive income from us. Right. And we've got a bunch of experience; my one partner has been in commercial real estate for 30 years.
[01:10:13] We've been building investment sales teams for 15 years. We've been doing media stuff, you know, this show, working with Forbes and Bloomberg. And yet what we're trying to do hasn't been done before, right? I mean, in many ways it's like Southwest Airlines, you know, competing with driving instead of competing with other airlines. It has been done, but it hasn't been done. Like, what we're doing has been done, but it hasn't been done.
[01:10:42] Anyways, I'm kind of rambling here, but my point is this idea of: we don't want to just have an investment fund, we want it to be a large multibillion-dollar one, by recombining elements in a way that people haven't in the past. How would you take anything we've talked about in the last six episodes, but especially what we've talked about today, and, as an outside consultant or something, what are some of the things you would encourage us to do to really stress test what we already have and think harder about it?
[01:11:19] Shane: Yeah. I mean, I know for a fact that you've gone through a really rigorous process of lateral thinking to get to this hypothesis, cause we've talked about it. And I can't say that I know enough about the industry to get into specifics, but I think that's not what you're asking for.
[01:11:38] Anyway, specifically around this episode, what I would do is this. At this juncture, you have a hypothesis that you think is good, and you've done a lot of work on it. What you're trying to do now is increase the probability that this, or whatever you end up doing, is going to work and get you to your goal.
[01:12:00] And so if you want to do that, then with this current incarnation of your idea, I think you want to play this game of: how could this go wrong? It sounds like you kind of already have, but this is where, before you start investing anything, you want to really think through the scenarios of what would make this not work out.
[01:12:24] Cause one of the things that happens is, the more time you spend on something, the more you'll end up having this bias, similar to the sunk cost thing. It's like: I've put so much energy and effort into this that I want it to work. And so I'm willing to see what confirms that, and I'm willing to overlook what doesn't confirm that.
[01:12:43] You really want to play this game of: what would kill this? What would make this not work out? What would make this not work as well? And I would take aspects of it. So I have a little list of questions that I think are useful for trying to determine if you have any biases built into your hypothesis.
[01:13:00] You know, you're trying to do this kind of thought experiment thing. I would ask yourself questions about aspects of this. I think you have a really solid foundation, a solid premise, and you're working from solid principles. But it's not like it's one thing; there are a lot of facets to what you're trying to do.
[01:13:18] I would take a look at those and ask yourself some of these questions from this list that I like. The first is: is this the first idea that I had? Is this the first thing that popped into my head? Anything you answer yes to, you can take an extra hard look at, because that will play into your biases. Yeah, that's just how human nature works.
[01:13:40] We really like it when it's just us, purely who we are, that came up with the right answer. So anything that you could say yes to, I would take an extra look at. And I would start to loop in people who you can call your extended team, your critics, people who could help you see what you don't see, or maybe be
[01:14:02] a naysayer to you. And you don't have to get dispirited by whatever they say; take what they say and put it through a filter, but factor it in. Then I'd also say, based on how you're describing it, I think this is not quite the case, but one of the questions I ask is: is there an interesting story about it? So, am I doing this because of the story of how Mike Bloomberg built Bloomberg? Am I doing this because of some past anecdote that gets me excited about it?
[01:14:32] If that's the case, you have, once again, a really good way to persuade people that what you're going to do works. But it's not a good way to persuade yourself that this is going to work, because there are different things about what you're doing than what other people have done. Another question is: did the idea for this come from someone or something that you like or dislike?
[01:14:53] So, you know: I got the idea for this media company that supports these investment funds from Shane, who I really like. Shane's the one who gave me the idea. Do I like this idea because I like Shane? Ask yourself that kind of question. And I'm not the one who gave you this idea, but you know what I'm getting at.
[01:15:12] And then the other two questions are, first: would it be really convenient if all of the assumptions that we're making were true? If it would be too convenient, then you want to take a harder look at it. And second: does this line up with my own incentives? Is there a personal incentive
[01:15:30] for me for this to be true? Like, is there a story you could tell of, hey, just for example, I've been talking about this idea on my podcast, so if the idea turns out to be true, it would make me look good for having said that on my podcast, and it would make me look bad
[01:15:48] if I have to pivot? If that's the case, then you want to think extra hard about what could possibly go wrong. So I know that's not too specific about your situation and what you're trying to think through,
[01:16:02] Jess: actually super helpful.
[01:16:03] Shane: But here's an example from something I'm working on right now. On Monday I had a meeting with a TV producer who is also a really smart businessman.
[01:16:15] He runs a production company for an A-list actor who also invests in and promotes certain really high-end consumer brands. Really cool. And so I was talking to him, the head of this production company, about them potentially getting involved in a TV project of mine, or really giving advice on how to navigate the TV project. I can't really explain the details, but basically it's a TV show
[01:16:43] that's going to launch a real-life product. So it's like the ultimate case study in content marketing. One of the plot lines for one of the characters in this TV show, which is going to be this grand cameo celebrity actor, you know, like in shows when there's a movie star that shows up every three episodes to be this cameo character, it's that sort of thing.
[01:17:05] There's gonna be a product launch out of something that this character is doing over the course of this TV show. And so it's super cool, a super cool idea that we're excited about. Our agents in Hollywood are excited about it; they're pitching big names. So I was talking to this guy about it, and he said: in order for this idea to work, you have to have both an awesome TV show and an awesome product.
[01:17:34] And that is going to be harder. It's hard enough to make a TV show, even when you have great writers and producers and all of that; it's hard enough to make a TV show that is actually awesome and actually remarkable. And it's really hard to launch a new consumer product, especially in this category that we're talking about, and have it be something that people will remark about.
[01:17:55] And so it's sort of a doubly hard challenge. And talking through that made me realize that part of why I've been so excited about this idea is because it would be really convenient if it worked out, because it plays into my story as the guy who built the content marketing company and helped pioneer a lot of the thinking in the content marketing industry, now doing this next-level thing in television. There's a personal incentive for me
[01:18:21] for this to work out. And the way this conversation went, ironically enough, is this guy said: you know, even though we do this business stuff, the stage the product's at would lead me to say I'm much more excited about producing the TV show and finding you another partner for the business side than I am about the business. Which is the opposite of what I thought would happen.
[01:18:43] And I'm glad he kind of, again, Popper's theory, sort of popped my bubble there. But I was really happy afterwards, because now I'm looking more clear-headedly at some of the potential things that could cause this plan to go wrong. You know, we're doing something that's innovative; by definition
[01:19:02] it's going to be harder and more risky. So I want to know where I'm bringing my own excitement and biases into this, so that I can shore up that part of the plan, so we can get the right team together and pull this off. And fingers crossed: we've done a lot of work and a lot of agonizing thinking in getting to this point, and there's still work to go, but I think if it works, it's going to be awesome.
[01:19:23] And actually, if just the TV show works and the product doesn't end up happening, fine. That's what I really need to be okay with, because with the show itself we're doing cool stuff, and it's okay. So being willing to let go of that sort of thing, I think, is important.
[01:19:40] So that's sort of my personal story about seeing this kind of thing come about in my own life, and realizing that if the goal was to be right, then I would have approached that conversation, and what he said, very differently. I would have spent time trying to convince him that, yeah, no, you should do this business thing, when actually it's not at the stage where that makes sense.
[01:20:03] And there's some more thinking that needs to be done.
[01:20:07] Jess: You know, what's so helpful about all of that is, for essentially the general partner of the fund, which in a real estate investment trust is called the advisor, not the GP, okay? We want what is essentially startup capital, right?
[01:20:23] So that we can pay for sales teams and marketing and stuff, to be able to raise the money into the fund itself. And normally that's kind of a high-risk, high-reward thing, right? So what we're doing instead is, we're taking all of that money and we're buying commercial real estate with it.
[01:20:39] And then we're only living off the rent, so that even if we go out of business, our friends are still okay. Like, if we get hit by a bus or go out of business, they already own those buildings, and they can just cashflow them themselves, or sell the buildings and get their money back. So it's our attempt at, hey, possibly a really huge upside, and a downside that has a reasonable path to at least getting your original capital back, kind of a thing. Right.
[01:21:04] Shane: That's great.
[01:21:06] Jess: And the big question in this space is, can you raise the money? Like, the guy who we hired as CEO has been doing research in commercial real estate for 18 years and has bought over $2 billion worth of these buildings.
[01:21:18] So we don't have any what-if factor of, can he buy the right kind of boring, reliable apartment building, right? It's really, can we get anybody to trust us with their money? That's the what-if factor. And we already have people talking about giving us a million dollars at a time, just at this advisor level.
[01:21:38] But yet I have some finance friends who I've talked to about it. And I said I wanted their advice, but really I was hoping they would come back with: you're the smartest person I've ever heard of, Jess, here's 5 million bucks. Right.
[01:21:50] Shane: Right.
[01:21:51] Jess: And instead they're like, wow, that seems super capital-inefficient,
[01:21:55] to raise so much extra money to buy that collateral. You know, if you really believe in this, why don't you just burn the cash and build the REIT? That's way too much work, that's really capital-inefficient, to tie all that up in those buildings if you don't really need them. Like, if you really believe this, why do you need an insurance policy?
[01:22:12] You know? And I've kind of dismissed them as: they just don't get it. Well, they're talking about investing other people's money, and I'm talking as an entrepreneur who needs to look his wife in the face and go,
[01:22:23] Shane: Yeah,
[01:22:23] Jess: well, I didn't really risk it that bad, honey; if this doesn't pan out, we still own these buildings.
[01:22:28] Right. And anyways, it makes me think: I actually want to go back to those exact same guys. Now that we're at a place where we're much more confident in what we've got, I do want to go back to them and say, hey, have a look at this, and, you know, you kind of mentioned this before,
[01:22:49] can you talk more about that? And do it from a place of more intellectual honesty, wanting their feedback, listening through: okay, am I silly, or are they just not seeing the vision? Instead of just being disappointed they didn't think I was a genius. Yeah.
[01:23:07] Shane: Well, first of all, I've already outed myself as not very savvy when it comes to real estate and investing. But this brings up something that I think is important: people are going to be wrong.
[01:23:23] A lot of people are going to think that your idea is crazy, right up until it works and they wish they would have invested earlier. Right. But that doesn't mean that they can't teach you something. And I've found, as a writer, the most useful feedback that I can ask people for is what's wrong,
[01:23:41] not what they think I should do. So when I have my friends, or my editors (certain editors have their own thing), review my writing, what I usually ask is: mark the parts where you get bored, and mark the parts that you don't understand. Because that identifies the problems.
[01:23:59] I've done a lot more thinking than they have about how I could potentially solve the problems. And I can deliberately consult people if I need to come up with creative answers to things. But when you get feedback on your writing that's like, oh, you should change this to this, that's not as helpful as, oh, this is a problem.
[01:24:19] And I think it's the same thing with a business idea like the one you're talking about, or anyone who's pitching anything. It's so obnoxious to me, and this has happened so many times: you pitch investors, and we've spent years of our life on this thing and we're raising the second round of funding, and they're like, oh, you know, what if you turned your freelance network into an app that helps people date each other? And I'm like, no, I'm not here for that weird idea.
[01:24:46] I want to know the problem that you see that's causing you to have to make the leap to this weird idea. Like, can we unwind it? So you think that the dating app idea is good, because of what? What's wrong? What do you see?
[01:24:59] Jess: A larger addressable market or something?
[01:25:02] Shane: Yeah. Or like, what are you seeing that I don't see that's wrong with my hypothesis.
[01:25:07] Not "what's your new hypothesis." So I like the idea of going back to the investors who maybe initially bummed you out, cause they weren't on board, and asking them to help you identify the concerns, so that you can then decide: are these real concerns? Are these just assumptions? You know, is this like the version of, oh, it would be cowardly to run away,
[01:25:29] when that doesn't matter if it means you win? Or are there things in there that can actually help you come up with a better plan?
[01:25:35] Jess: I love it. Listen, this has been great. Anything else you want to put in here, or do you think we've covered it?
[01:25:41] Shane: I think we've covered it.
[01:25:43] Jess: I love it. Well, everybody, hopefully you've enjoyed these last six episodes.
[01:25:48] Thanks so much for listening. Bye now.