When I was six, my parents took me and my brother to Opryland, a theme park in Nashville, Tennessee that has since been replaced by a mall. When we rocked up to the magic show, I happened to see the switch under the magician’s table. Being a curious kid, I asked him what it was. He told me to stop asking questions, or he’d make my pants disappear — foreshadowing a lifetime of quietly murmuring “Die, motherfucker, die!” to my shame, so I can keep asking questions you’re not supposed to ask.
Questioning lie detection in a national security context in the U.S. is forbidden, so I sued a bunch of federal agencies for their polygraph program info as part of my graduate research (and partly won). Questioning doctors is taboo in the hierarchical world of medicine, so I made it my mission as a patient advocate to help people with difficult, uncertain diagnoses navigate their struggle to get needed care. Questioning what women are told, and tell each other and ourselves, about our life choices is another natural sacred cow for me to spear and grill up.
How does infant feeding science work as social control? How about research on risks of antidepressants in pregnancy? What’s the space between what we’re told about the risks of abortion, and what the data really say — and what does that space reveal about the power at work in conversations ostensibly meant to empower women? What’s the deal with propaganda masquerading as science, and how do you get the boot off the neck when people just see numbers and do what they’re told?
There are many funny things about making my own life hard by asking these sorts of questions. One is that they tend to draw attention to me, and I have been at times too shy to go to the grocery store. Another is that the way I work problems by nature — obsessively — tends to make me an expert on them. Yet, like a lot of seemingly smarter people, I tend to see that I’m stupid but learning. I think it’s that beginner’s mind that lets me ask forbidden questions.
There’s a lot of confusing dominance (might) with expertise (right), especially in realms of power like law enforcement and medicine. Not coincidentally, these fields self-select for dominant people. But the confusion is everywhere, just like power. Much could be said about this, and has been by famous anti-authoritarians from J.S. Mill to Jon Stewart.
I like to think this dominant (hah) dominance-expertise confusion is part of a cosmic joke in intelligence itself: The smarter you are, the more aware you are that you, too, are a poor, stupid fuck just like the rest of us. I don’t trust people who aren’t nervous about this sad fact. I suspect it helps explain why we apparently can’t solve collective action problems to save the global ecosystem.
As political scientist Philip Tetlock and others have shown, subject-area experts tend to be confident and wrong. Some might go so far as to say they’re gaming a social system that rewards macho bluster, at a time when society should be out combing the gutters for uncertain dweebs to save our civilization. The few — the proud — the poor, stupid fucks who know it.
Lately I’ve been thinking a lot about the poor, stupid fuck effect — also known as the Dunning-Kruger effect. Smart people thinking they’re too stupid to say something, and dumb people thinking they’re so smart, they talk a lot. Helps explain why the media and social media landscapes look the way they do, no?
This highlights an obvious conundrum: Apparently, now I think I’m smart enough to say stuff. But I’m also smart enough to know I’m stupid, and I’ve probably got something wrong. And being fundamentally neither stupid nor smart — but merely obsessive — I’m probably going to figure it out later. And it’s going to hurt.
The saving grace is: “I've gotten used to it. So used to it, in fact, that I actively seek out new opportunities to feel stupid. I wouldn't know what to do without that feeling. I even think it's supposed to be this way.” So says Martin Schwartz in “The importance of stupidity in scientific research.”
I like this feeling so much — of anticipating the pleasure of learning, after the pain of learning I was stupid — that I’ve coined a new word for it: die Fehlerlernvorfreude. The joyful anticipation of learning from your mistakes. It feels so much better to learn why you were wrong — than to know you were wrong somehow (if you’re human), but not know why. Knowing that I am going to learn from it and so get smarter helps me cope with the anxiety of knowing that I’m going to make mistakes — which is an anticipation of the pain of the shame of being stupid, to be German about it.
This was the way I felt about my dissertation research, even though I did everything by the book and had it checked by external experts. Everyone encouraged me to publish this huge swath of original work, and I felt it wasn’t time. It wasn’t right. I hadn’t gotten it. And I was right. I had to take a break, work on other projects, and relearn a lot of what I had been taught about methods (which are usually taught wrong), to figure out why my intuition was right — to see and learn from my mistakes. Only then did the existential terror give way to the joy of seeing that I was right, that I was wrong — and now I can be less wrong. You might say I can trust my gut.
What if I’m not the only one who’s felt this way? There’s talk of a crisis in scientific and medical research: widespread problems including common methodological mistakes and outright fraud. The “file-drawer problem” is part of it — evidence that significant results get reported while null results stay filed away unpublished (aka publication bias).
Proposed solutions often include better open science infrastructure that lets researchers put IRB (Institutional Review Board) protocols, study instruments, data, code, and more online early and often. I’m down for it. I wish with every cell of my body that this had existed when I was a grad student. It was hard to share stuff like this (it still would be hard now, though it’s gotten easier). And there’s no technological reason, at this moment in human history, that it has to be.
But there’s not a lot of acknowledgment of the role of quiet self-awareness of limitations in all of this. Toon Tellegen calls it, in his beautifully gentle terms, “unspoken shyness.” When there’s a pause in a conversation, scientific or otherwise, we shouldn’t necessarily interpret it as something nefarious. When there’s uncertainty, often we should accept it instead of automatically denying it or trying to solve it. This is a hard trick.
Lie detectors commonly misinterpret internal states on different axes than truthfulness-deception (like calm-anxiety and comfort-pain) as signs of guilt. Physicians commonly dismiss patients with difficult diagnoses like lupus as well or crazy. Meta-scientists (scientists studying science) often interpret the file-drawer problem as a sign of foul play. But in all these cases, something uncertain is being read as something nefarious. The unknown — the “trick” in “trick or treat” — is the scariest thing we know. It reminds us what poor, stupid fucks we are.
All we can ever do — all scientists and other human beings have ever done — is accept our own limitations and learn. Make another mistake. Figure it out. Make another. And go on. Sometimes quietly. Sometimes not.