Here’s an observation from Matt Yglesias’ blog post about misinformation:
A normal person can tell you lots of factual information about his life, his work, his neighborhood, and his hobbies but very little about the FDA clinical trial process or the moon landing. But do you know who knows a ton about the moon landing? Crazy people who think it’s fake. They don’t have crank opinions because they are misinformed, they have tons and tons of moon-related factual information because they’re cranks. If you can remember the number of the Kennedy administration executive order about reducing troop levels in Vietnam, then you’re probably a crank — that EO plays a big role in Kennedy-related conspiracy theories, so it’s conspiracy theorists who know all the details.
A major misconception about “misinformation” is that it is a literal lack of factual information. For example, Americans with more scientific literacy are slightly less likely to accept the scientific consensus on climate change.1 The mechanism is that more educated individuals have more polarised beliefs, and climate change is a highly polarising topic in America. More knowledge can lead you to form correct beliefs, or it can be used as ammunition to defend outrageous positions. The world contains lots of people like Alex Epstein, who would mop the floor with me in any climate science exam and who think we need to vastly increase our fossil fuel use.
Thus, a heuristic: Where your factual information outpaces your natural curiosity, you’re probably a crank. For example, I know that geothermal was excluded from section 390 of the Energy Policy Act of 2005, which may be killing progress on geothermal because new developments must pass strenuous environmental review. I have not hitherto demonstrated curiosity about geothermal energy in general. Therefore, on this issue, I am probably a crank.
I don’t know if he even remembers saying it, but one of my friend’s pronouncements has stuck in my head for some time: “Either we have non-partisan ignoramuses, or educated ideologues.” Then he paused for a moment and added, “And if I had to choose, I’d go for the educated ideologues.”
Now, that’s too pessimistic for me. I doubt we would see the same pattern with topics that aren’t already politically controversial. But, to take a broader view, it’s tempting to have a linear model of intellectual development, in which learning useful facts is consistently good. Subconsciously, I think this is what motivates me to read more books. In reality, if you are anything like most people, your thinking on certain issues (politics, religion, etc.) will be reliably stupid. You can easily waste your life going down intellectual dead-ends, and no one will stop you.
This is depressing to admit, but it does shed light on some of the misinformation results. Understanding does not consist in having knowledgeable people fill up your brain with true information. To understand is to traverse choppy waters without being sucked in by manias, fads, and biases. If you proceed slowly but don’t get sucked in, you will still be doing better than most.
1. This is my interpretation of the slightly negative regression coefficient in Table 1. The coefficient isn’t statistically significant and I don’t necessarily expect it to replicate; the fact that it’s not overwhelmingly positive is what’s important!