How to Create the Changes You Want by Rewiring Your Belief Systems with “Liminal Thinking” Methods

Liminal thinking: The pyramid of belief

I absolutely *love* the insights Dave shares in this video.

Especially the part about the self-sealing logic bubbles that can keep us from tapping into our greater potential. I recommend you watch Dave sketch out his process of “Liminal Thinking,” and then dig into my article titled “3 Ways to Unlock Confidence with Peace of Mind,” because when you apply his brilliance to it, I think it’ll take you to new heights.

Oh, and, by the way, Dave references a book that he’s working on. Turns out it’s already published, and it’s awesome…

If you geek out like I do about reprogramming your unconsciously conditioned beliefs, then you’ll want to grab it too. Amazon has it.

Video Text:

Hi everybody, my name is Dave Gray, and I’ve been working on a book whose working title is Liminal Thinking.

I want to start with the ancient story of the blind men and the elephant. You know the story. So there’s an elephant… The story is that a king brings blind men into the palace from all across the city. Then he has an elephant brought in, and he asks the blind men to describe the elephant.

The blind men, of course, get into a big argument.

One of them is touching the trunk, another one is touching the tail, someone is touching the side. And they can’t agree, right?

One is saying that it’s a wall.

This guy over here is saying, oh, it’s like a snake. This guy’s saying, no, it’s a rope.

And someone touching the ear says, no, no, it’s a fan. So what’s the point of this story?

We’ve heard the point of this story many times: the elephant is reality, right?

And the point of the story is that we’re all blind. We’re all seeing aspects of reality, and we’re not able to agree because each of us is seeing a piece of the truth.

In fact, all of these blind men are saying things that they believe are true, but none of them has the whole truth. The world elephant is another ancient idea: that the whole world rests on the back of a giant elephant, the elephant of the world.

Let’s just call this reality, okay?

The elephant is reality, but reality is also unknowable; we could also call this the unknowable. None of us has the whole truth. All of us have just a piece of it. And here’s how that works.

Each of us has our own experiences and observations of things that we see and experience in the world, right? We observe things, we experience things, and those experiences are going to be different. They’re going to be generally pretty local.

So we have these experiences, and from those experiences we select: we notice those things that are relevant to us. We can’t notice everything; our brains just aren’t that powerful. We only notice the things that are relevant, usually the things that are relevant to our specific needs.

And then from those things that are relevant, we make assumptions.

And from those assumptions we draw conclusions.

And from those conclusions we form beliefs. And here we are. And by belief I mean everything you think you know. Everything you know… it’s a belief.

Here we are standing up at the top here. And we easily get confused. We’re standing on top of this pyramid that we constructed, but actually we think we’re on the ground. So we have this imaginary ground, and we think our beliefs, these things that we know, ARE reality.

When I say the word elephant and we talk about the blind men and the elephant, you have this picture of an elephant in your head. But you don’t actually believe that you have an elephant inside of your skull, do you?

But we act as if we do. We act as if our idea of the elephant, whether it’s a snake or a fan or a wall or a rope, is the only truth. That OUR truth is the only truth.

And so what’s up here? Down here we have the unknowable; up here we have the obvious. What is the obvious? The obvious is the set of things that we never question because we believe them to be true. We form a kind of bubble around this obviousness, and we include those people who tend to agree with us. Usually they are people who tend to have similar experiences to ours; maybe they grew up in a similar environment. We create this bubble, and this bubble is confirmed over and over again by a kind of self-sealing logic that’s rarely tested.

This is my picture of self-sealing logic.

What do I mean by self-sealing logic? All right. There was a study done about the Iraq war. The stated purpose of going into the Iraq war, according to the administration, was to stop Saddam Hussein because he had weapons of mass destruction. Researchers did a study asking people, after the fact, even after it was very clear that there were no weapons of mass destruction, why they thought we went.

And, lo and behold, the way people described it was things like, “Well, if we went there, we must’ve had a reason. There must have been a reason.” So often we’ll start with conclusions: “If we went there, there must’ve been a reason.”

And this has been demonstrated over and over and over, empirically, by researchers: we start with the conclusions that we want to believe, the things that we want to believe, supporting the people that are on our team, and so on. And then we work backward to construct a rationale. That rationale might take any number of forms, but in the end, it’s self-sealing.

It’s self-sealing logic.

Okay, so let’s say you grew up in a Christian country, here. Your experiences are mostly Christian. Now let’s take someone over here who grew up in a Muslim country. Do you think their pyramid is going to be different? Well, certainly they’re going to have different experiences and observations. They’re going to have different needs, so different things are going to be relevant out of that experience. Probably different assumptions. Different conclusions. Different beliefs.

They’re going to have the same problem, though: they’re going to have a different bubble and a different set of obvious things. Then what happens? Now, with globalization and the Internet, Facebook and that kind of thing, these worlds are colliding more and more. There’s different self-sealing logic, right? But this isn’t something that just some people do. This is something that all people do. We all do this. We have a set of beliefs that we cling to, and we have self-sealing logic that we use to defend it. There’s a guy named Chris Argyris who calls these organizational defensive routines, and I got the term self-sealing logic from him as well. He also has something that he calls “the ladder of inference.”

Which is what I’ve started drawing here on the pyramid. The ladder of inference. It’s how we get there.

So what happens? Well, you get these kinds of culture wars, right? People saying there’s a problem over here, and the same over there. And sometimes we even get real wars, right? How do we solve this problem? How do we move these pieces closer to each other? There’s only one way. But first, let’s talk a little bit about how this self-sealing logic works. There are two ways that we make sense of new ideas:

One is, let’s say, here’s a new idea, okay? Some new idea. Somebody is out here; we don’t know what pyramid they’re on, right? Maybe they have some shared experiences with us, maybe not. We meet them in an airport or something, or on Facebook. And they have a new idea. How do I make sense of a new idea?

Well, the first thing is we look for internal coherence. Does it make sense with what we already know? So one is internal coherence: does it fit with what we already believe? If it fits with what we already believe, we’ll have a tendency to accept it.

Okay, what’s number two? Two: external validity. That means, can we test it? If I take this idea and try it out, will it work? The problem is that if an idea doesn’t have internal coherence, if it doesn’t resonate with what you already know and what you already think is obvious, it’s going to bounce off. It’s going to be immediately rejected. So we reject those things that do not have internal coherence. A lot of the ideas from over here are just going to bounce back. They’ll be rejected, and vice versa. And this is where we get escalating conflict.

Now, if you’re going to reject an idea because it doesn’t have internal coherence, you’re probably not going to test it for external validity (even though an idea that comes from outside is exactly the kind of idea that might lead you to expand or change your beliefs). You have this bubble of self-sealing logic that protects you from any new idea that’s going to challenge or change your beliefs.

So this is a sort of a protective bubble. Why?

Why? Because what’s in here feels safe. We like certainty. We like safety. What this bubble is actually doing is keeping out all the fog and fear that comes with actually dealing with reality, right? Reality is uncertain. In fact, even the ground is moving under our feet all the time, very, very slowly, as we know. I started out in the newspaper industry, and, you know, things change. The world changes. Nothing stays certain forever. Nothing remains the same forever. Everything changes. Many of my friends in the newspaper industry had been there for 30 years, knowing that the industry was in trouble, and still didn’t leave. They didn’t do anything about it except stay inside the bubble, continuing to use the self-sealing logic to protect themselves from this uncertainty, just to feel safe.

Now, the problem is we have the self-sealing logic protecting us from new ideas, protecting our beliefs from any change. And the only things that we actually test for external validity are the things that make it through this filter. So we only test things for external validity if they are already internally coherent. You see how this creates a self-sealing bubble that could be dangerous over the long term?

Now think of it this way: the fewer experiences you have, the narrower your pyramid, the narrower your set of experiences. Let’s say you grew up in one town and never left it, and grew up with the same group of people. Your experiences, what you find relevant, your needs, your assumptions, your conclusions, your beliefs are probably going to be equally narrow. You’re not going to be able to expand beyond your experiences very easily. So here you are. Now, being in a space like this, the narrower it is, in some ways, the safer it feels. And the more the things outside of it feel scary and dark. However, the wider a pyramid is, the safer it actually is.

So the idea is that, to most of us, this stuff that’s going on in here… this is all unconscious. This is kind of underwater. All this construction of belief is not something we do consciously. It’s something we do unconsciously.

Now, how do we solve this problem? People arguing, problems, conflict, what have you. How do we solve this problem? Well, one way is to actually make this process conscious: to understand HOW you go about constructing your beliefs… To take a flashlight and peer down in there, and start to walk down there. Start to deconstruct your own pyramid: “Why do I believe this? What are the conclusions that I made? What are the assumptions that I made? What were the things that were relevant? What are my needs, and why were certain kinds of experiences the things I thought were relevant?”

Then we can start walking over here, and start walking up other people’s belief systems. To do that, we have to find a way to, number one, deconstruct our own beliefs, and number two, understand other people’s beliefs and how they came to them. It doesn’t mean we have to agree, but it DOES mean we have to step outside of our bubble for a minute. So we’re going to walk. How do you do that?

Number one, you have to suspend judgment. Suspend your judgment. Suspend disbelief. People act and talk in ways that make sense to them. So if you’re hearing someone and what they’re saying doesn’t make sense to you, you’re missing something. You might think, that person’s crazy. That person’s insane. Here’s the thing: they are not crazy. It makes sense to them. If something makes sense to them, and it doesn’t make sense to you, then you’re missing something. Right? So you’ve got to start asking yourself,

“What would I have to believe in order to think that? What would I need to believe in order to think that?”

I’ve got more to say on this, and I’ll say more over time. But this is the start of something that I’m calling liminal thinking. Liminal comes from the Latin word for threshold, and liminal thinking is a skill that I think is going to be increasingly important for leaders as we move into the twenty-first century, and as more and more of these worlds start to collide. Liminal thinking is a way of connecting people, connecting ideas, and expanding our ways of understanding the world so we can see better and communicate better. I have more to say on this in the future, but I hope you enjoyed this, and we’ll talk soon.