Davi Barker of The Black Sale joins us to discuss his book, Authoritarian Sociopathy. We talk about the various psychological experiments that have been done in the past to show people’s propensity to obey or disobey authority, and Barker proposes his own experiment that could take this research even further.
Watch this video on BitChute / DTube / YouTube or Download the mp4
TRANSCRIPT
DAVI BARKER (DB): Something about this field of study innately is everybody immediately asks the question, “Well, what would I do?” Right? And we all want to believe that we’re on the side that disobeys. But the statistics bear out that we probably aren’t. And so that creates a need to recognize when this is happening to yourself.
VOICEOVER: You’re listening to The Corbett Report.
JAMES CORBETT (JC): Hello, friends. James Corbett here at corbettreport.com in a conversation that’s being recorded on the 17th of April 2019.
Today we’re going to be talking to a new guest, a first-time guest here on the program, but someone who will be familiar to those who listen to Declare Your Independence with Ernest Hancock, which I’m a regular guest on.
You’ll probably already know of Davi Barker and his work. He’s affiliated with a number of projects and sites and books and other things—Bitcoin Not Bombs, and all sorts of other things.
But for the purposes of today, we’ll be directing people to one of his many websites, theblacksale.com. That’s Sale, S-A-L-E, and that link will be in the show notes.
But today we’re going to be talking specifically about one of the books he has written, Authoritarian Sociopathy, which is also the subject of a talk that he’s given and articles that he’s written.
You’ll find a number of things online under Davi Barker’s name with that title. But today, let’s talk about it specifically. Davi, thank you very much for joining us on the program today.
DAVI BARKER: Hey, yeah, thanks for having me.
JAMES CORBETT: Alright. So, we’re going to be talking about authoritarian sociopathy, and I think people will probably get the gist of where this is heading just from the title alone.
But let’s introduce the topic properly. In your Libertopia 2013 presentation on this topic that is available on YouTube, you start that presentation by saying, “The reason to study psychology, if you’re interested in libertarianism or anarchy, is because, factually speaking, your enemy only exists in the mind.”
A very important point—and one that may be over the heads of some of the listeners. Let’s spell that out for people. What do you mean by that?
DB: Yeah. It’s almost a technicality, but it’s almost the whole cake, too. If you look at the standard definition of “government” in Western statecraft, it’s generally something along the lines of “that group which claims a monopoly on the legitimate use of force in a given land mass,” et cetera, et cetera. Now, all of those things are factual claims. A land mass is a factual thing. Force is a factual thing. But legitimate use—that’s something that only takes place in the mind.
So, a lot of what this book is about, and I think what a lot of libertarianism is about—and anarchy, for that matter, as well—is defining or redefining or reclaiming what we view as legitimate.
JC: Exactly right. So, what is legitimate power? How do we decide this? There are a number of psychological experiments over the years that have been conducted along these lines.
My audience is going to be familiar with the Milgram experiment and the Stanford Prison experiment, because I’ve talked about [them] a number of times.
If people aren’t familiar, just type Milgram or Stanford Prison into my search bar. You will find the previous times I’ve talked about [them]. But I’m sure a lot of people are familiar with those experiments.
One of the things you do in Authoritarian Sociopathy is detail a couple of more recent experiments that people might not know about—because they obviously haven’t received the same amount of publicity.
One thing to note about these experiments: they're not as dramatic, not as fireworks-inducing as Stanford or Milgram, because of the ethical guideline changes that resulted from those earlier experiments. Let's address that for a moment.
DB: This is actually something interesting. When I first pitched this at PorcFest (you can watch that YouTube video), I said that the APA [American Psychological Association] changed the guidelines and made it illegal.
Well, in the transition from this pitch—which was thrown together in forty-eight hours—to publishing the first version of the first edition of the book, I learned that [the APA is] not even a government agency. They’re an association. They’re like a club of psychologists and psychiatrists and other academics. And they write guidelines that affect public funding. But their guidelines aren’t law. So, that’s an important distinction.
Most academics, most university professors [who] are doing this kind of research are going to want to abide by their guidelines. So all of the experiments since then are toned way down. They’re not dramatic. They’re trying to avoid traumatizing the subjects. And so the results are . . .
JC: It’s a no-no to make people believe they’re killing people . . .
DB: Yeah, I guess. I mean, they pushed the button themselves, so I think they traumatized themselves.
I don’t know of anyone else calling it “authoritarian sociopathy.” I’m throwing a net around these experiments and I’m saying: These are all measuring the same thing that the Stanford Prison experiment and the Milgram experiment are testing. But they’re all testing it in different ways.
It’s giving us a broader picture—giving us more data. I’m saying, “Let’s look at all the data. And then let’s say: What do we want? What data do we not have that we want to seek out?”
JC: Right. These experiments that we’re talking about—Stanford and Milgram and the other ones we’re going to be talking about—all come under that umbrella. They’re all talking about power and how power manifests in the mind of people in these experiments.
DB: Right.
JC: Let’s define terms first, perhaps. What is authoritarian? What is sociopathy?
DB: This is, again, difficult, because every researcher is going to define these things differently. Some of the experiments are going to be about corporate power, and some of them are going to be about political power.
I like to think of it as the desire to assert your will over someone else’s will—to literally usurp their authority over themselves. If that’s your goal—whatever it is, for whatever reason—that’s authoritarian.
What I’m suggesting is that if you look at these experiments, that position, what Philip Zimbardo called “situational power,” induces sociopathic behavior in almost everyone—in people who wouldn’t exhibit it normally.
JC: Right. And Zimbardo talked about “The Lucifer Effect,” as he termed it, and talked about how that manifested in very real life in places like Abu Ghraib. Some very interesting talks on that subject.
Let’s get into some of the other experiments. There’s one by Dana Carney, which talked about power relations.
DB: Sure, you want to start there?
JC: Yeah, let’s talk about that.
DB: Power and lying is what that one was about.
She [Dana Carney] is from the University of California, Berkeley. And, if I remember correctly, it had something to do with EKGs. Or they had them hooked up to something. They were essentially able to show that if you take a body of subjects and you divide them randomly—"You're the power group, and you're the subordinate group"—and then you induce them to lie, the people who are just randomly given power don't have the same physiological and autonomic responses that you would expect a lie detector to pick up.
You can show a statistically significant difference in the way a subordinate class responds to lie-detecting techniques and the way a power class does. And in [the latter] case, it actually induces pleasurable responses. Normally it causes tapping or anxiety or you exhibit stress when you’re lying. [The people in the power group] don’t have those symptoms. Or at least they have them statistically less in a randomized sampling.
JC: Right. Yes, it’s important to stress the randomization of this. This isn’t a thing where people with power and people without power go into the experiment. No, this is an experiment where they sort people randomly into groups of executives and workers in a corporate setting.
DB: Yeah. In this one, they put on a one-hour mock business, where one person is the boss, and the subordinates have to follow orders. Just that is enough to get the juices running in the brain so that they respond to lie detectors differently. That’s amazing.
JC: And I think one of the key takeaways [is that] they were measuring stress hormones, cortisol and other factors in this. They found that not only did the people in the high power positions who lied not exhibit any of the natural physiological symptoms that we associate with lying—like what the lower status people exhibited when they were lying—but that the higher status people actually took pleasure in the lie.
DB: Yeah. That was self-reported. That wasn't picked up by the lie detector. That was done in an emotional survey after the experiment. The people in the power group reported feeling happy about lying, and the people in the subordinate group reported feeling stressed or anxious about lying.
JC: Right. Yes. And for people who are rightfully skeptical of lie detector technology in general, I think this had more to do with, as you say, EKGs. They were measuring cortisol. They were measuring . . .
DB: I think there were four distinct measurements.
* There was watching the video recording and paying attention to ticks and movements that people associate with bodily stress. [Watching behavior,] because that’s autonomic.
* There was the actual lie detector mechanism.
* There was self-reporting.
* There was reaction time and cognitive impairment. They scored poorly on video game-style tests of reaction time. If you lie, you're stressed out, and so your body is not as good at reaction time, but [the power group's reaction times] don't diminish that way.
JC: Right. Okay, so what do we take away from that experiment? What does that teach us?
DB: Man, what does that teach us? It teaches us that even if you send an honest person to [political] office, their incentives in that office are to lie. That they’re going to be emotionally, chemically rewarded for using their power in deceptive ways—even if they wouldn’t have before, even if they never would have before.
JC: Exactly. And that’s why the randomization is important. This isn’t something that’s inherent in certain people. This is anyone who is randomly selected for a position of authority and exhibits these particular traits. That’s a fascinating insight.
DB: Right.
JC: Let's move on to another experiment that you talk about in the book, one conducted by Gerben A. van Kleef and some associates. What was this experiment about?
DB: Hang on a second. Van Kleef is power and compassion. This is one of my favorites, because I think it's the root of the matter.
This is a storyteller and listener scenario. Subjects are divided into power and non-power. Then they’re paired randomly, where one is told to tell a story and the other is told to listen [to the story] while they’re hooked up to the same bodily stress test things.
If you were the storyteller, for example, and I were the listener, we would both be having our emotional state monitored. You would be instructed to tell me a story of personal suffering. Now, if we coupled randomly as two low-power individuals, then I would experience a measurable emotional response in relatively short order right after you do. My readout is going to match yours a little bit.
High-powered people don’t exhibit that trait as much. One of the interesting things about this study was that . . . they all accurately identified the emotions, [but] after the fact, in the survey, they reported not caring.
So, that’s something that happens. It’s a “not caring about the little people” or “not remembering where you came from” phenomenon, where, once you become in power, you begin to think of yourself as more important, and so their suffering becomes less important.
But the other thing was, if you're a low-power storyteller and I'm a high-power listener, your measured suffering is, on average, throughout the experiment, greater than [it would be with] a low-power listener. So, my callousness has a measurable effect on increasing your suffering.
JC: Fascinating stuff. What do you think is the takeaway from this?
DB: Oh, man. Well, compassion is one of those down-deep root parts of the human condition. It’s also the motivating factor of a lot of people who put their energy toward politics. Even if they’re misguided or [. . .] maybe they’re being tricked, compassion is one of those levers on the human heart that [high-power] people use to pull [low-power people] into mass movements of all kinds.
JC: I feel your pain . . .
DB: Right! So, if you give power to compassion, it inverts. The person you put to represent you in a position of power now doesn’t feel the compassion you elected them to express.
JC: And again, it should be noted, it’s not just about the types of people who are attracted to positions of power. It’s anyone who’s put in that position of power.
DB: Yeah, I believe this experiment tested both. [I think] there was one version of it where it was random and there was another version of it where the subjects were allowed to self-identify. Or there was some way that they measurably put them into two groups of people who were high-power in their life. I’m not sure.
JC: Do you remember if the results were different depending on that?
DB: I don’t remember off the top of my head. A lot of what this book is . . . Just so you guys know, this book is very small. It’s almost a reading list. It’s almost like: Here is a primer of things you should read by people smarter than me.
JC: Well, we’re getting to the meat and potatoes. This is really just the set-up to the point of this [interview].
But [there is] one more experiment that I think we should talk about. That’s Joris Lammers from Tilburg University and Adam Galinsky of the Kellogg School of Management. They conducted a battery of experiments designed to test how having a sense of power influences a person’s moral standards. Let’s talk about their experiment.
DB: This was the hypocrisy study. It is for me the most interesting of all of them. It ranks [up there] with Milgram for me as far as its implications.
It’s why social sciences are soft sciences. The thing with all of these studies is that a lot of your data is the product of high numbers and low reliability. Your data is self-reported. These experiments are almost entirely self-reported results. We’re talking about your sense of moral offendedness.
All of these studies are designed to get at the discrepancy between what a person’s degree of tolerance of others or condemnation toward others is . . . versus the amount of allowances that they give themselves for those same moral infractions.
The studies covered everything from stealing a bicycle to finding a bag of money to cheating on lottery numbers. But this is not really happening. This is not a science. This is not a Stanford Prison experiment. This is people self-reporting in surveys after participating in a power dynamic.
But the result is staggering for me. When I read this, I was amazed. What they did in their final experiment was they separated the power group into two categories. [In one of the categories,] there are people self-identifying as “I’m in the power group.” [In the other category,] people are self-identifying as “I’m in the subordinate group.”
They said to the ones who self-identified as powerful people: “Tell me an example of a time in your life you were in a position of power and it was legitimate.”
They said to the other group: “Tell me a story about a time in your life when you were in a position of power that you felt was illegitimate.”
The second group are people who also self-selected as the power group, but they are talking about a time when they felt their power was “illegitimate.” They become self-critical and tolerant of others, [whereas] the “legitimate” power group is tolerant of themselves—lenient to themselves—and breaks rules . . . but is draconian toward others.
That’s the first scientific study I’ve ever seen that confirms the hypothesis: It’s all about the sense of legitimacy.
JC: Right. Elaborate on that. Why is that important?
DB: Because it’s in their head too. Their sense of legitimacy is a social phenomenon. It’s based on the people they interact with in their lives and whether they confirm that sense of legitimacy by social means. We do this all the time with each other, where we confirm each other’s behaviors and attitudes with “Amen” or whatever the expression is, like “Truth, brother.”
And if you undermine a person in power's sense of legitimacy, then they become self-critical. It's almost like a silver bullet. Maybe you could expand on this? I don't know how to effectively apply it.
JC: I think one of the interesting phrases that comes out of this is what the researchers dubbed "hypercrisy," as opposed to "hypocrisy."
DB: Hypercrisy. Right.
JC: “Hypercrisy” is when you’re more critical because you believe your power to be illegitimate. You’re actually more critical of yourself.
DB: You’re watching yourself.
JC: The delegitimization of the supposed legitimate authority is, in a sense, the silver bullet that we’re looking for here, because it actually makes the people who are in those positions question themselves even harder than the average person would.
DB: Yeah. And if you look at the world that way, then the entire election process becomes this dance. It's like a rain dance for legitimacy to rain down from the ballot box. That's literally what it is. It's this ritual that this particular crop of primates has come up with to invest authority in whatever crown or throne they've come up with.
JC: The people have spoken. It's the will of the people that I'm here. And because of that, people are enslaved in their own minds. This is The Most Dangerous Superstition that Larken Rose talks about. This is what it boils down to. There it is, in experiment form.
And again, whatever people make of the self-reporting aspects, at any rate, there is something there. And this goes back to what I’ve talked about many times with Laughing At Tyrants and other things like this. Just laughing at the stupidity of it is something productive, is something positive that we can do that actually dismantles their perceived legitimacy. So, I think it’s an important aspect and shouldn’t be neglected.
DB: The other thing I’m saying is: If this is one study that had these dramatically significant results, wouldn’t you want to go and repeat all of the other studies and see: Does illegitimate power have a reversing effect on lying? Does it have a reversing effect on empathy?
JC: Yeah. There was an element of that to the Milgram experiment, because, as people might know, they’d conducted it many times in many different iterations. Some of them were conducted on campus with the white lab coats, and others were conducted in seedy office buildings downtown, detached from the academic setting. People were more likely to say “no” when it was in that [seedy] setting rather than in prestigious hallowed halls of academia.
So, again, it’s the idea: Is this legitimate power that’s giving me these instructions? Or is it illegitimate? That’s so much of what this comes back down to.
DB: Yeah.
JC: There is so much detail in there, people really should read the book to [learn] about these different experiments and how they relate.
But this is really just the setup to your kicker, which is your proposal for some sort of experiment that could be conducted in some form—you’ve got some ideas of how it can be done—but some form that we can push this a little bit further and really see what’s making things tick here and [ask] are there independent variables that can be tweaked to make people more or less compliant with authority. And things like this.
Let’s hear about your idea for an experiment that can and should be conducted.
DB: Sure. Well, this is where it gets a little interactive. The version of the book that’s on Amazon right now and the one that I have in stock is the fifth or sixth edition. And that’s because every time I do a talk, every time I do an interview, people send me [requests to] look at this experiment . . . look at this story . . . look at this thing going on. There’s a feedback mechanism that’s happening.
I’m inviting listeners [to this show] to do that. Send me your experiments, because there are going to be future editions of this book. Other than the one currently published, I have other designs that I want to write up.
In the current edition, there’s an experiment written up that I titled “Police Brutality: An Experimental View.”
The hypothesis is that people will intervene in an instance of aggressive force more often if they don't think the aggressor's power is legitimate. That there's a measurable difference between the way a person responds to aggression and the way people respond to aggression with a badge.
This experiment was devised fairly quickly at first, but now I’ve had a lot of experts look at it and find the problems. We’ve tweaked it and we’ve added to it.
We’ve got a contrived scenario where a person is witness to another person being brutalized as a pre-recording, just like Milgram’s was. Milgram’s recordings were technically pre-recorded for consistency.
Half the subjects see a cop beating up a guy and half the subjects see a civilian beating up the guy. The question is: When, if ever, do they intervene?
We can walk through it step by step, if you want. You want to go through the whole thing?
JC: Let’s do it. Let’s start with where you’re thinking this would take place and in what way you’re recruiting people for this.
DB: The idea was to do it in a shopping mall or someplace where there’s rentable space. [You approach someone and you say,] “Do you want to take a survey? Do you want to watch a movie trailer? Do you want to participate in some whatever?” [Asking these questions] gives you an opportunity to admit that it’s a social experiment and potentially even secure a waiver of some kind. You’re tricking them into watching a violent video they think is real, so you want at least some degree of consent. This is a sticky one, because you can’t give complete informed consent on a social experiment. But the APA hasn’t solved that problem either, so . . . .
I think the best you can do is brace them for what they’re in for. Account for the worst.
JC: I mean, you could say something like, “Oh, do you want to see a movie trailer that involves violence,” right?
DB: Yeah, that was the proposal that I landed on in this edition [of the book]. If you could get them to say that they were willing to watch a movie trailer that had realistic violence, then tricking them into watching a security feed that depicted violence was fair game, right?
The other standard that came out of the discussion was someone suggested that anything that would be readily available on evening news should be fair game, because that’s culturally what we expect.
JC: Yeah. And that envelope has been pushed in recent years, hasn’t it? So there you go.
DB: Yeah. I mean, that’s the other thing. There’s an element of this book that is about renegotiating all of these ethical questions. I’m open to that feedback, too.
Nobody’s run one of these simulations as far as I know, although I invite them to.
JC: Right. So, [by] one pretense or another, you get the person in the room. They think they’re going to watch the trailer or whatever it is.
DB: Right.
JC: I’ll let you set it up.
DB: Let’s talk some architecture here for a second. You approach them with a clipboard in the mall. You say, “Follow me. I’m going to show you the video in this back room.”
You walk them down a hallway, where it has to be obvious that there’s a video camera. And maybe there’s something in the room that’s recognizable—like a carpet or a potted plant or something that they’ll reference when they see the feed later.
Then you take them into a room. You say, “Please fill out this questionnaire and wait here while we get the next group together.”
And now they’re sitting in a room where they’re looking at a screen that looks like the security feed of the hall they just came from. Does that make sense?
[Corbett nods “yes.”]
So they should, just by context, be tricked into thinking the video feed is the hall they just came from, because of the rug or because of the potted plant.
JC: But it is a pre-recorded video.
DB: It is a pre-recorded video that plays after they sign the waiver. So then you have a personality profile. You got them to sit there with a clipboard and fill out what they think is getting them a $10 Walmart gift card or whatever, which I think we should give them.
You can get name. You can get marital status. You can get whatever kind of socially interesting rubric you want on your subjects’ profiles. You can ask them even interesting questions.
I proposed asking them whether they were publicly schooled or homeschooled, if that made a difference, or if they’d ever heard of the Milgram experiment. Just “‘Yes’ or ‘no,’ have you ever heard of the Milgram experiment?” If you could show that that alone made a significant difference, that’s cool.
Then the surveyor comes out with another participant, and this is a confederate of the experiment. They give them their clipboard and thank them and give them their [gift] card, and they walk across the room, and then they go into the hallway.
The moment they go into the hallway, the video becomes a recording of them being assaulted by someone for unclear reasons. I propose that this should be done with stunt choreographers to make sure it is as picture-perfect as possible.
You don’t want to contaminate the samples. You want a clear subject difference, where one variable is that he’s a police officer assaulting someone and one variable is that he’s a civilian assaulting someone.
The question is: When does the subject, if ever, open the door? Opening the door to go into the hallway is “intervening.”
I don’t care what they were going to do [once they’ve opened the door]. I don’t care if it was [to] call the police. I don’t care if it was [to] take out a camera. I don’t care if they imagined they were going to physically remove the guy.
At that point, they took physical action to leave one scene and enter another. I’m calling that an “intervention.”
Then you interview them. You want to pick their brain and say, “What were you planning? What did you think? How did you react?” Collect as much immediate data as you can.
Someone proposed—and I think it’s a good idea, maybe from Dr. Stephanie Murphy—that you do an exit interview that’s off the record with a counselor you have on board.
The counselor would say, “If this was too traumatic, if you feel like you want to talk more, here’s a professional who deals with trauma.” You do this in case there’s any sort of triggering that happens with anybody. Because you don’t know. You don’t know what you . . .
JC: Someone may have been a victim of police brutality or something. So, is the hypothesis that people would be more likely to respond when there is not a badge involved?
DB: Let me read it. The hypothesis is just a paragraph in here.
There are three.
Hypothesis #1: "Given the opportunity, a significant portion of the general population will not intervene in a clear incident of unprovoked police brutality."
Hypothesis #2: “There will be a statistically significant difference between the percentage of people who will intervene in an incident of police brutality and people who will intervene in an incident of brutality by someone in civilian clothes.”
Hypothesis #3: “Demographic information, personality, socioeconomic lifestyle, or other information can be discovered which correlates with high rates of intervention in an incident of police brutality, which will allow us to begin to create a psychological profile of those willing to intervene against corrupt authority.”
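Hypothesis #2 is, at bottom, a comparison of intervention rates between two conditions, which is a textbook 2x2 contingency problem. Here is a minimal sketch of the analysis it would call for; the counts are made-up placeholders, not results from any actual run of the experiment:

```python
# Sketch of the analysis Hypothesis #2 calls for: compare intervention
# rates between the "badge" and "civilian" conditions. All counts are
# hypothetical placeholders, not real experimental data.
from scipy.stats import chi2_contingency

# rows: condition; columns: [intervened, did not intervene]
table = [
    [12, 88],   # subjects who saw a police officer as the aggressor
    [35, 65],   # subjects who saw a civilian as the aggressor
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
# A small p-value would mean condition and intervention rate are not
# independent, which is what Hypothesis #2 predicts.
```

For Hypothesis #3, the same intervention outcome could be regressed against the demographic fields from the clipboard questionnaire (schooling, familiarity with Milgram, and so on), for instance with a logistic regression, to build the "willing to intervene" profile Barker describes.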
JC: I’m on board. I think it sounds like a cool experiment. I think it would definitely teach us something. It’s been, what, six, seven, eight years since you came up with this. So, you’ve made a lot of . . .
DB: Well, yeah, the pitch was the experiment itself. It turned out to carry an expensive price tag. A production company that actually does this kind of thing quoted me something like $100,000. It's in the book.
I’ve also had psychology students reach out to me and say that they don’t even think it violates the APA guidelines and they can do it in their universities. If that’s the case, yeah, take it and run with it. Open source all of it. I encourage people to do that and devise their own experiments and start networking the data.
JC: Yeah. It needs to be thought out and all the pieces need to be in place. I would recommend people read the book to see the way that you set it up and the things that you talk about.
But, yeah, we definitely need something like this. Let’s elaborate on why, because some people might be thinking, “Who cares? We already know.”
DB: Well, worst-case scenario, the spookiest reason why is because if power controls where the money in psychology research goes, power is not going to allow itself to be researched. So, because of this very phenomenon of authoritarian sociopathy, leaving it in the hands of people in power to research is a contradiction. It’s a bias. It might even invalidate the data. I don’t know.
JC: . . . which actually ties back to an interview I had just a couple months ago with Dr. Bruce Levine, where we were talking about resisting illegitimate authority, specifically. He was talking about the way that the psychological profession has been structured around these perceived legitimate powers and authorities that he himself, as a practicing psychiatrist, has been fighting against most of his career.
DB: Yeah. That happens obviously because it is a power structure and because this is not a feature of the Republican Party or the Catholic Church or the . . . whatever. It’s a feature of human beings, the primate. Anytime there’s more than twenty of them in a room, they’re going to start building these little hierarchies and they’re going to start having these sorts of effects on their psychology. So, be aware of it.
JC: Yeah, and I think people should reflect on just how far the Milgram experiment has penetrated into the popular consciousness. Even people who aren't particularly interested in psychology or authoritarianism or things like that have probably at least heard of this experiment.
And that is important because it does create at least the space for the conversation—to open up the conversation about power and how it operates and what does it mean and are you willing to obey?
I think that’s where a lot of this ultimately comes back to for me. I mean, I’m interested in these types of experiments in and of themselves. But, I think one of the meta effects of these types of things is to at least get people to step back and question themselves, and to think and to reflect:
Does this affect me? How so? What would I do in that situation? I think it is something that we need to open the conversation to.
DB: . . . and which group am I in? Yeah, absolutely. Something about this field of study, innately, is [that] everybody immediately asks the question, “Well, what would I do?”
We all want to believe that we’re on the side that disobeys. But the statistics bear out that we probably aren’t. That creates a need. A need to recognize when this is happening to yourself, I guess. When are you just going along to get along? And do you have to?
JC: Yes. And if the answer is no, then what are you doing? Yeah. And it puts you in that mind frame. I mean, if I were ever in a Milgram experiment, I would know, “Oh, this is a Milgram experiment.” Because I know about the experiment.
DB: You’d know immediately.
JC: Perhaps that influences the way I would actually act in a real-life situation?
DB: I bet it would. But that’s interesting data, too. If you could demonstrate the kind of outreach that was statistically effective at actually changing people’s behavior in a controlled environment, that’s powerful.
JC: Yeah. So, once this experiment that you’re proposing here has been conducted thousands of times with millions of participants all around the globe, and then somehow we can get some of those same participants into a second study to see how they would react . . . it’s a large project, huh?
DB: Part of it is, and this is going to be in successive editions. I don’t think the data-to-price-tag ratio is worth it. I think what I’ve designed at the moment is an expensive way to get this data, and I think there are less expensive ways to do it. So I’ve been devising that, and I’ve been getting suggestions about that, too.
Also, technology is more advanced all the time. It’s plausible we could use the digital infrastructure that’s already around us, so we don’t need to rent the mall and we don’t need to dress up the cop. It’s one of the things I said earlier: it’s about having lots of data that’s imprecise. That’s what you want. You want 2,000 people to give you imprecise data so you can see the big bell curves. And the way this is designed right now, it’s calling for something like 200 participants, and that’s expensive. So it’s a low volume of data, but maybe too precise.
JC: Yep, yep, yep. But since this is already an experiment that involves simulated elements, how far can you take that? Does it have to be physically in the space of the mall?
DB: Well, I don’t think so at all. We’ve been talking about devising simulations online pretty much since we first started talking about this.
I’ve been thinking recently that the way to do it is [with] video games, because video games cause the player to make in-world calculations very much like real-world calculations that are ethical in nature—or can be orchestrated to be.
The other thing is: This is not my area of expertise. I am in a way asking other people to take this up, even in an amateur way.
One of the things I’d like to devise is the types of things that you could do from your home or even among friends. If you could simulate a measured result in a safe way over dinner, that’d be really powerful to be able to do with friends or to do at parties or something like that.
JC: There are many different ways that we could take something like this and run with it. I’m excited to see the different ideas that Corbett Report members will have in the comment section here.
Also, I’m assuming people will probably want to get in touch with you and share some of their ideas. What’s the best way for people to get in contact with you?
DB: Davi [at] bitcoinnotbombs.com is the easiest email to get me.
JC: Alright, and I’ll link that in the show notes so people can see that. I think we’re going to leave it there for today on this topic. It’s such a huge topic, and there are so many different bits and rabbit holes we could get into.
But is there anything else you’d like to leave people with on this topic before we leave it here?
DB: Well, it’s meta, because being willing to question the authority of the man in the lab coat de facto means you can question the authority of the university that employs him, which de facto means you can question whatever authority it is that makes you think you’re not an amateur psychologist.
JC: In fact, when you said that word “amateur,” I wanted to say, “You know, that’s not necessarily a pejorative.”
DB: I don’t think of it as a pejorative. I called it “a renegade psychological experiment,” because we are all independently, naturally scientists in some sense—synthesizing the evidence of our senses and applying it. And we all have to do that at some level. So, let’s really do it.
JC: Yeah, exactly. Don’t take other people’s word for it. But get good ideas from other people and apply them when you can.
Lots of things to think about. You’ve talked about this in numerous lectures, and there are different permutations of this available online. I’ll direct people to the book, because it’ll probably be the one-stop shop for this. But there are lots of different threads here that people can pick up and examine.
So, we’re going to leave it there for today. Davi Barker, thank you so much for joining us today. I hope we have the chance to talk again in the future.
DB: Absolutely. I had a great time.

EMPATHY
The more similar an organism is to me, the more I will empathize; I have an easier time eating beef than monkey meat. If somebody who is very similar to me suffers, I have an easy time putting myself in their shoes. Because it could happen to me, it is important that I pay close attention. Trump will never be in my shoes, so the suffering of a dog is more meaningful to him.
I was surprised to come across a poll at Derrick Broze’s website asking if he should run for mayor of Houston. Only a sliver of his readership voted, but a majority clicked in support of his running; the combined “No” and “Hell No, You Sell Out!” (my vote) tallies still fall short of those in favor, which I find incredible. I don’t think he’d stand a chance anyway, but in an imaginary scenario, I do believe the disturbing tendencies noted in Davi Barker’s book would soon manifest.
Question: in the experiment where the two groups were asked to describe a previously held position of either “legitimate” or “illegitimate” authority, were any of those situations actually described? Unless some of the subjects were already of an anarchist bent, it would be very difficult to recall a time of illegitimate authority. Off the top of my head, I can’t think of such a position, except “Oh, yeah, that time I was head of the neighborhood drug ring – I feel kinda bad about smashing Vito’s kneecaps.”
Great conversation. A funny thing happened yesterday: I attended a superior court session with a friend for a fine-reduction motion. After the judge seated herself, she made a declaration about possible conflicts of interest, since she had previously worked for the prosecutor’s office and her husband currently works for that same office; if anyone had any protestations, this was the time to make a motion. This was very interesting: for a smirking, wink-wink moment she had to prostrate herself before an authority greater than herself. The robe had been lifted, and for a fleeting second the overwhelming contextual truth paved a path for empathy for the human element in the meat-grinder of the parasitic justice system. People say it is always best to be first on the docket while the judge is in reasonable spirits, and it turned out great for my friend: fines reduced from 8k to 2k with the mark of the crown’s agent and 19 pages of documents pre-completed. With government entities, it’s not about making money, it’s about losing less.
“It is difficult for men in high office to avoid the malady of self-delusion. They are always surrounded by worshipers. They are constantly, and for the most part sincerely, assured of their greatness. They live in an artificial atmosphere of adulation and exaltation which sooner or later impairs their judgment. They are in grave danger of becoming careless and arrogant.”
–President Calvin Coolidge
I talked about this proposed experiment back in 2017 on AM talk radio. I hope they do it. In my version, the cop or civilian was beating up an old lady, with the hypothesis that people would not intervene on the old lady’s behalf when the cop is doing the beating, but would when the civilian is.
I’d like to suggest that a cost-effective method of observing some of these sociopathic behaviors in a natural environment would be simply to observe the daily operations at your local Department of Motor Vehicles, where petty officials brandish their “authority” and the victims willingly line up and watch others be robbed and bureaucratically abused. A security guard at the DMV I spoke with recently described the absolute hell of working there: the abuse emanating from behind the governmental desk was directed not only at the public but also filtered down through the different strata of employees, and the rage it generated often led to violence, hence the need for a security guard (and a poster on the wall warning of the consequences of threatening a public employee).
This was such a fascinating conversation. Thanks for the food for thought. I have some thoughts about the terms discussed here. Stephen Porges, a neuroscientist, distinguishes empathy from compassion. He notes that neurologically, when feeling or expressing empathy one is using the sympathetic nervous system (fight or flight). He defines compassion by the use of the parasympathetic nervous system (rest and digest). I wonder if there is a way to define neurologically which state, sympathetic or parasympathetic, sociopathic behavior utilizes. If so, is there a measurable difference between those in power responding sympathetically, and therefore empathically, and those responding parasympathetically, and therefore compassionately? Ultimately, is there a measurable difference between those in power responding sociopathically and those responding compassionately, since compassion would not elicit an autonomic behavioral response? Do people who act with compassion seek positions of power? Do those who neurologically present more sociopathically seek positions of power over the empathetic or compassionate ones? So many questions can be derived from this concept. Excited to hear more.
bharani
I agree!
I have already watched this twice.
There are a lot of important insights.
I plan to review the video for a third time.
This is a 5 STAR interview!
*****
I too am excited to hear more. I offer my little city as a fertile laboratory for the two needed types in leadership that Kinetics suggests for study.
Can you imagine the absurdity of a technician approaching a billionaire Zionist bank owner and suggesting he “be part of a study”? Of course he will ask, “What are you studying?” You know what he is going to say after a pregnant pause: “Get T.. F… out!”
I have a long list of one type and a much shorter list of the other type if anyone cares to pursue it.