OK, I promise, this will be the last Stasi post ever.
tl;dr: This post is too long. Don’t read it.
Let me be sincere for a moment before lapsing into the sort of counterproductive sarcasm that you all love or hate so much.
Cass Sunstein, like Richard Epstein, is a prolific law professor with public policy interests. Sunstein and Epstein are well connected, they’re plugged in, they’re willing to write about just about anything, they have the ear of powerful government officials, and they seem to have strong feelings that if people just followed their recommendations, the world would be a better place.
OK, to be fair, just about all of us feel that the world would be a better place if more people listened to us. I know I feel that way.
So what’s my beef? My beef with the Sunstein/Epstein twins is that they really really really don’t like to admit their mistakes, even when said mistakes are in everyone’s face.
This bothers me. As a statistician, I’m attuned to uncertainty, and as a social scientist and student of social science, I’m strongly aware of how much we learn from anomalies. I’m with Popper and Lakatos and Jaynes on this one: Build strong models, make big assumptions, issue strong statements, and then when (inevitably) you’re proven wrong, when you’re embarrassed in front of the world, own your errors and figure out what went wrong in your reasoning. Science is self-correcting—but only if we self-correct.
Richard Epstein is a buffoon; the less said about him, the better. Sunstein bothers me more, for the same reason that David Brooks bothers me: on one hand, they make strong and often obnoxious statements and then later don’t go back and admit they were wrong; on the other, this is entirely in contrast to their stated doctrines (Sunstein’s shtick about cognitive illusions; Brooks’s shtick about humility).
I can’t imagine I’m going to change Sunstein’s beliefs or his behavior. He seems to have a combination of the advocate’s approach of never acknowledging a counterargument, along with the journalist’s attitude that yesterday is gone and we should look toward tomorrow. And I’ve already tried and failed with Brooks (including lots of polite emails, once upon a time). So you can spare me the honey-works-better-than-vinegar advice. My audience right now is you, not them.
So, in all sincerity: I’m bothered and I’m angry. Epstein and Sunstein have some social influence. They could do better. I don’t agree with all their political positions, but that’s a separate point. They could argue their cases in a way that allows for understanding and discovery, rather than in a closed way in which they never examine their own adjustments, they never face their own anomalies. They’re preening to the world, and they should spend some time holding their arguments up to a mirror.
A solid argument could be made that I should shut up about this—not because it’s boring, not because I’m being mean, but because by giving good advice to Epstein and Sunstein, I’m actually giving them the opportunity to be better thinkers . . . and thus to do more damage. Maybe we’re actually better off that the foremost proponent of nudging keeps embarrassing himself, as this discredits that policy a bit.
Ultimately, though, as a statistician and social scientist, I have some sympathy for Epstein and Sunstein in their quests for evidence-based policy. Not in all the details that they can’t seem to get right, nor in the attitude that we should follow the guidance of rich people and celebrity law professors, but in the Bill Jamesian idea that we can do better through systematic analysis of the social-science variety.
It’s ventin’ time
OK, now that we got the sincerity out of the way . . . I made the mistake of reading Cass Sunstein’s latest column, “Why Coronavirus (and Other) Falsehoods Are Believable,” which states:
The broader phenomenon is something that psychologists call “truth bias”: People show a general tendency to think that statements are truthful, even if they have good reason to disbelieve those statements. If, for example, people are provided with information that has clearly been discredited, they might nonetheless rely on that information in forming their judgments. . . .
OK, fine. But now we’ll get some of Sunstein’s special expertise:
The underlying problem goes by an unlovely name: “meta-cognitive myopia.” The basic idea is that people are highly attuned to “primary information” . . . By contrast, we are less attuned to “meta-information,” meaning information about whether primary information is accurate. . . .
Always good to get a spoonful of jargon with our factoids. The jargon serves a similar function to the stuff in the toothpaste that gives you that tingly feeling when you brush: it has no direct function, but it conveys the impression that something is working.
Sunstein then describes the result of a recent psychology experiment.
What’s frustrating here is that Sunstein does not give any specific examples of erroneous evidential claims that people have believed, even after they’ve been refuted. So I thought I could help out.
Here’s a false claim that got spread around the world:
That’s pretty wrong, actually. As we discussed here, that passage exhibits the scientist-as-hero fallacy, neglect of variation, and a piranha violation—along with the even simpler error that it’s reporting some experiments that never existed.
But if you put the phrase “another Wansink (2006) masterpiece” in a bestselling book, people will keep remembering it—even after the work in question has been refuted.
Here’s another false claim that got some attention:
Here’s another claim, albeit one that hasn’t been disproved, at least not yet:
Then there was this:
Knowing a person’s political leanings should not affect your assessment of how good a doctor she is — or whether she is likely to be a good accountant or a talented architect. But in practice, does it? Recently we conducted an experiment to answer that question. Our study . . . found that knowing about people’s political beliefs did interfere with the ability to assess those people’s expertise in other, unrelated domains.
The study in question never said anything about doctors, accountants, architects, or any professional skills. Shoot . . . I hate when that happens. A false or misleading claim gets out there in the national media, the authors don’t correct it, and it can hang around forever.
Or, hmmmm, anybody remember this:
At this stage, no one can specify the magnitude of the threat from the coronavirus. But one thing is clear: A lot of people are more scared than they have any reason to be. . . . Many people will take precautionary steps (canceling vacations, refusing to fly, avoiding whole nations) even if there is no adequate reason to do that. Those steps can in turn increase economic dislocations, including plummeting stock prices.
You spread an idea like that in public, and people might believe it—even after it’s been refuted.
It might seem odd that Sunstein cares so much about stock prices, but then there’s this from last January:
A simple measure of presidential performance takes account of just two variables: approval rating and the Dow. The argument for APDOW, as we might call it, is that public opinion matters, because it captures the wisdom of crowds, and that the performance of the stock market matters, because it provides one measure of how the economy is doing.
A simple measure, indeed. To be fair, this was in the Opinion section of the website, not the News section.
A way forward
This all seems to be a big problem: falsehoods can keep circulating even after they’ve been refuted.
I think what we all need is some Harvard professors to nudge us into doing what’s good for ourselves.
So far, our nudges are:
– Follow the dietary advice of Brian Wansink; the man is brilliant.
– Don’t selfishly be scared about the coronavirus. Be public spirited. Think about the stock market.
And, of course:
– Never ever admit that you made a mistake.
If we can all get nudged in that direction, all should be fine with the stock market and coronavirus and everything else in the world. It will all be rainbows and unicorns with no methodological terrorists, no Stasi, no “ill-considered and graceless” bloggers. Just a bunch of happy celebrities giving TED talks to each other about nudges, life hacks, and all the ways in which ordinary people need to be saved from their own poor judgment.
What the hell??? Sunstein writes a whole essay on “truth bias” and how falsehoods can stay afloat even after being refuted—and he doesn’t once refer to his own extensive history of spreading falsehoods. Spreading falsehoods unintentionally is no shame in itself: if you write enough things, you’ll make some mistakes (I know I do). The problem is not in making mistakes; it’s in not admitting them. It’s hard to learn from your errors if you refuse to acknowledge them. This is a guy who held a high government post and whose whole shtick is that he designs research-based interventions, but (a) he has a track record of believing bad research, and (b) he doesn’t seem to go back and correct his mistakes or, even more importantly, figure out what went wrong with the reasoning that let him get fooled in the first place. I don’t want the government to put this guy in charge of our soup bowls—or our coronavirus policy.