Hello and welcome to Wilde Truth: Research Methods for Revolution. I’m looking forward to sharing my writing at the intersection of science and activism every week or two.
My name is Vera Wilde, and I’m a scientist (PhD), author, and transparency activist. I’ve also sold some paintings, and published and occasionally performed humor and poetry, including in the Poetry Brothel Berlin.
My personal experiences — as the daughter of a then-undiagnosed lupus patient, as a patient and citizen myself, and above all as a woman — have been the most valuable part of my scientific education. They got me interested in decision-making tech, helped diagnose my mom, and sparked my interest in bias, leading to a PhD dissertation and postdoc funded by National Science Foundation grant and fellowship support that took me to UCLA and Harvard. I was pitching police departments nationwide a database project to increase fairness and transparency while Ferguson burned.
Since leaving academia and the U.S. in 2015 to make art and travel the world, I’ve continued publishing on lie detection and been quoted as a lie detection expert in Wired, McClatchy Newspapers, and elsewhere. Following an artist residency at Dutch hackerspace Hack42, I fell in love with a hacker and resettled in Berlin.
When journalist Ann-Kathrin Nezik of Die Zeit (the German New York Times) interviewed me about the AI “lie detector” iBorderCtrl in late August of 2020, I was nine months pregnant with my son. Two months later, life took an unexpected turn when I realized as a new mom that breastfeeding, bottles, and formula are misunderstood technologies. And the way we think about breastfeeding makes the same mistake as the way we think about lie detection — the naturalistic fallacy that nature just gets it right. In reality, there is no unique lie response to detect — and there is no infant survival guarantee in nature. Far from it. Increasing those odds by feeding babies enough milk early and often makes sense. Indeed, teaching women to starve their babies is dangerous and surprisingly recent pseudoscience. And calling bullshit is my specialty.
My first article on common and preventable harm to newborns from the way we do breastfeeding today was published in the medical journal Cureus. As was the next one, on the related association between neonatal jaundice — a common complication of starvation from breastfeeding gone wrong — and autism. This got me deep into the methods weeds, fascinated by statistics reform efforts and their implications across medicine and science. There are revolutions afoot here — and I love revolutions.
What are research methods anyway, and why are they revolutionary?
Research methods are ways of seeing and listening to the world that help us observe and make sense of it better. Usually, people who study people start by learning research design, including something about different ways of making meaning from different sources — interviews, texts, surveys, experiments, and newer stuff like open-source satellite images and social media platform scrapings. Then you move on to stats.
Some people will tell you that research methods is about stats, the whole stats, and nothing but the stats. These people are statisticians who want to get invited to more parties.
Methods dons like Sander Greenland and Miguel Hernán have a bigger beast in mind. Cognitive bias pervades thought, shaping how we interpret data, and we have to deal with it early and often, or suffer the consequences — lost lives, liberty, jobs…
Questions about cause and effect, rather than description, should also be at the center of statistical inference. The truism is true: correlation doesn’t mean causation. But we want to know about causation, so we should go for that and say so.
To research design, stats, cognitive science, and causal inference as parts of what we mean when we talk about methods, I would add information freedom. It has two sides: when you're investigating a subject, you can request records from public institutions under Freedom of Information (FOI) laws and regulations in many places. And when you've got information, you should make it freely available to others, on the first principle of valuing the truth.
So that's what I mean by research methods. Why are they revolutionary? The most obvious answer is that they empower people to question illegitimate authority. Just as reading the Bible for yourself after Martin Luther's 95 Theses meant you no longer needed a priest to save your soul, you don't need a scientific priest if you can read Table 2 yourself. But, as usual, what I'm most interested in here is much more transgressive. I'm interested in uncertainty.
Of the three maxims carved above the entrance to the Temple of Apollo at Delphi, most people who know this sort of thing know only the first, often called the Socratic maxim: "Know thyself." The others are "Nothing in excess" and "Surety brings ruin." These maxims summarize modern statistics reform efforts. Unpacking that is another post, but suffice it to say that changing the way researchers report and interpret results to embrace uncertainty instead of rejecting it is key.
This is what leading statisticians had in mind when they recently called en masse for an end to statistical significance test thresholding and misinterpretation. This means scientists should stop emphasizing point estimates or, worse, binary results of statistical significance tests. (This also applies to other statistical tests, in principle. Coming soon: down with funnel-plot tests for publication bias!)
Instead, we should emphasize analyses’ 95% confidence intervals, and the possible risks and benefits they imply. Paint the target, not the bullseye.
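Here's a minimal sketch of the difference, using made-up numbers and a plain normal approximation (not anyone's published analysis): instead of asking whether a p-value crosses a threshold, report the whole interval and read off the range of risk it leaves on the table.

```python
# Toy illustration (hypothetical numbers): report the 95% confidence
# interval for a risk difference instead of a yes/no significance verdict.
import math

# Hypothetical comparison: 12/200 adverse outcomes in group A, 22/200 in group B
a_events, a_n = 12, 200
b_events, b_n = 22, 200

p_a, p_b = a_events / a_n, b_events / b_n
diff = p_b - p_a  # risk difference between groups

# Normal-approximation standard error of the risk difference
se = math.sqrt(p_a * (1 - p_a) / a_n + p_b * (1 - p_b) / b_n)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

# Emphasize the whole interval: the target, not the bullseye
print(f"Risk difference: {diff:.3f} (95% CI {lo:.3f} to {hi:.3f})")
```

With these made-up counts the interval just barely crosses zero, so a thresholded test would shrug "not significant," while the interval itself says the data are consistent with anything from essentially no difference to roughly ten extra bad outcomes per hundred. That range, and what it would mean for real people, is the finding.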
This is, I think, what meta-scientist Hilda Bastian is getting at in her PLOS Blog entitled “Absolutely Maybe.”
In the immortal words of Modest Mouse, we need to become “certainly uncertain.”
This is revolutionary, because it upends how we think about expertise and doubt. You know the old adage, “Fake it til you make it”?
Forget that. This is about making it without faking it. Admitting — nay, insisting — that what we don’t know is a key part of what we do know. And making sense of — instead of denying or trying to solve — a world that is, and will always be, full of uncertainty.
Uncertainty that can do a lot of damage if we don’t hold it out in the light, say its name, and think through what it means for the public interest.