Crony Beliefs: Appendix

Evidence for epistemic cronyism

In my essay on crony beliefs, I was mostly concerned with constructing a plausibility argument. Here I'd like to document some of the evidence we have showing that the brain is in fact engaging in belief-cronyism.

Keep in mind that it's nearly impossible to prove that any particular belief is a crony. The whole point of crony beliefs is that they masquerade as ordinary merit beliefs. So rather than trying to pin down specific beliefs as cronies, we need to look for evidence of cronyism in general. A lot of this evidence will be statistical, similar to how astronomers can infer the existence of dark matter without, at present, being able to detect it directly.

In particular, there are two strands of research:

  1. Evidence that our brains don't function like meritocracies. Basically, we know how belief-systems should work in order to extract the maximum amount of information from their interactions with the world — but human brains fall short of that ideal in a number of ways. This kind of evidence includes:
    • Belief perseverance.
    • Belief polarization.
    • The backfire effect.
    • The puzzle of disagreement in light of Aumann's agreement theorem.
    • Confirmation bias.
    • More generally, the entire field of motivated reasoning.
  2. Evidence of outright cronyism. This includes:
    • The large number of cognitive biases that are blatantly self-serving. Here's Robert Trivers's summary of the literature: "At every single stage [of information processing] — from its biased arrival, to its biased encoding, to organizing it around false logic, to misremembering and then misrepresenting it to others — the mind continually acts to distort information flow in favor of the usual good goal of appearing better than one really is."
    • The game theory of strategic irrationality, as discussed by Thomas Schelling in his book The Strategy of Conflict. Specifically, it's the incentives created by mixed-motive scenarios (part competition, part cooperation) that make irrationality an occasionally-winning move.
    • The large literature that models the ego as the brain's press secretary (rather than its head executive).
    • The strong social emotions that attend our beliefs, including pride, shame, and anger.

I'd also like to appeal to folk evidence. All of 1984, for example, is a case study in epistemic cronyism. Or consider Upton Sinclair: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it." And of course we've all caught whiffs of cronyism in our own minds — maybe not in the heat of an argument, but certainly in our more honest, reflective moments.

Crony beliefs vs. lies

There's no polite way to say this, so I'll just state it plainly: crony beliefs are deceptive. By definition, they exist to manipulate other people's impressions of us. "We deceive ourselves," as Robert Trivers says, "the better to deceive others."

The question then arises: Why go to the trouble of internalizing these crony fictions, when we could simply lie and achieve a similar result?

The answer is that crony beliefs and deliberate lies have different cost/benefit profiles. We might summarize this difference by saying that crony beliefs pollute our minds, whereas lies pollute our conversations. Depending on circumstances, then, it can make more sense either (A) to believe the truth, then construct deliberate lies as the need arises, or (B) to believe a crony fiction, but retain the luxury of speaking freely.

Here, then, are some of the factors that determine whether a lie or a crony belief will be more useful:

  • How habitual and complex is the deception? If it's a small, one-off thing, a lie typically does the trick. But lying is hard, so if it's a complex deception (with lots of interlocking ideas) — one that needs to be maintained over a long period of time, perhaps in front of multiple audiences — then it's probably easier just to internalize the belief. To co-opt Mark Twain: "If you tell the truth [i.e., say what you believe] you don't have to remember anything."
  • How carefully are we being scrutinized? How important is it not to get caught? Crony beliefs are very hard to detect, and even when they're detected, they're effectively impossible to prosecute. Lies, however, are comparatively easy to detect, since people are always on the lookout for statements that don't match up. This suggests that our biggest, most important fictions will probably show up as crony beliefs instead of deliberate lies.
  • How much conflict is there between the belief in question and the rest of our belief-system? It's a lot more work to hire a crony belief when there's too much conflict with the other beliefs in our brain. Certainly it's possible, but it creates cognitive dissonance. It also creates a sharply inconsistent worldview that will make us look like we're lying, even if we really believe what we're saying. In these cases, it's often better to maintain a separate, deliberate lie rather than try to force an incompatible idea into our worldview.
  • How actionable is the belief? How dangerous would it be to act on? If we aren't going to act on the information, we might as well internalize and believe it. For example, if our brains sense that there's social utility in the idea that "CEOs are overpaid," there's little reason not to admit such an idea into our worldview, since most of us aren't in a position to act on it. On the other hand, suppose you have a rival coworker who's always stabbing you in the back, but whom it's important to stay friendly with. In this case, it would be dangerous for you to believe that he's actually a nice guy. Much safer just to enact a few choice lies, e.g., smiling to his face.

Note that there's a smooth continuum between perfectly internalized crony beliefs and deliberate, bald-faced lies. Points along this continuum include: half-hearted beliefs, lip-service beliefs, exaggerations, half-truths, etc.

Note also that "believing" something doesn't necessarily mean we have to act on it. In fact we seem perfectly willing to ignore our crony beliefs when acting on them would be inconvenient. This is reminiscent of Robin Hanson's near mode/far mode dichotomy (crony beliefs being held in far mode, action-oriented merit beliefs in near mode). Or as Louis C.K. puts it: "I have a lot of beliefs, and I live by none of 'em.... I just like believing them. I like that part. They're my little believies; they make me feel good about who I am. But if they get in the way of a thing I want, I fucking do that."

Last updated November 2, 2016.