Correspondences: Open Science in Progress
TRANSFORM trial registration could be pending; scientific communications on polygraph bias review
TRANSFORM trial registration: pending, not missing?
This fall, I flagged potentially unethical research on mass screening for the low-prevalence problem of prostate cancer. Investigators of the UK TRANSFORM trial, a £42 million study, plan to recruit hundreds of thousands of men in an attempt to improve prostate cancer screening while promoting it, an intervention that may net harm subjects without provably improving overall mortality.
The study appears to lack a clinical trial registration entry. Such pre-registration is a cornerstone of scientific transparency and good clinical practice, helping to prevent problems including outcome switching, selective reporting, and inflated false-positive rates.
Evidence-based UK patient advocacy group HealthSense continued their follow-up with transparency pioneer Dr Nicholas DeVito, postdoctoral researcher at Oxford University’s Bennett Institute for Applied Data Science, who advised:
It is possible that the trial isn’t registered because it hasn’t started yet. Sometimes teams don’t register the trials until right before they are about to start because things could be changing right up until then and they would like a final version for the registration.
In other words, it may just be a timing issue. While prospective registration is required for NIHR-funded trials (trials funded by the UK’s National Institute for Health and Care Research), it’s not uncommon for major studies to appear in registries only right before recruitment begins. The TRANSFORM website says that the team “expects that in early 2025, the first men will be invited to take part in the study.” So we may see an entry appear soon.
Does this make the trial’s current lack of pre-registration copacetic? Not exactly.
One vision of open science is that you can improve the world — and your own research — by putting stuff out there that’s not perfect, so other people can help you make it better.
In the TRANSFORM context, where the chief ethical concern is that healthy people could be encouraged to undergo interventions that might net harm them, the burden is high for researchers to at least consider collaborating with some of their many vocal critics to avoid that outcome through good design, informed consent, and transparency. Registering the protocol before beginning recruitment would allow independent experts to scrutinize the design before any human subjects could be harmed in the study. Once a trial is already underway, incentives grow for researchers to make post-hoc justifications for what they have started doing.
But nothing is perfect, least of all scientific research. It’s encouraging that we should soon learn how TRANSFORM researchers are informing participants about possible risks and benefits. It will be interesting to see whether this trial’s design meaningfully addresses the flaws of its predecessors.
Author response on polygraph bias review: deny, distract, degrade
My last post critiqued Whittaker et al.’s recent review on racial bias in polygraphy (“Racial biases in polygraphs and their legal implications,” Nat Hum Behav 9, 3–4 (2025); open-access preprint). The authors claim that racial bias in polygraphy is an established fact, explained by racial differences in skin conductance responses (SCR) causing higher proportions of “inconclusive” results and worsening criminal justice inequities. However, the studies cited do not provide empirical evidence to substantiate these claims.
Notably, the article neither quantifies the effect size of the alleged differences, nor cites evidence that possible racial differences in SCR measures in lab studies generalize to field polygraph contexts. More fundamentally, it skips the deeper problem: it’s not clear what scientific test could establish bias in lie detection in the field, since it’s not clear what scientific test could validate lie detection — period.
Deny
In response to my critique, the review’s senior author, Oregon State University Assistant Psychology Professor Daniel Bradford, dismissed my analysis as merely a “well-written and entertaining” post — calling it an attempted critique and not addressing the substantive issues raised.
At the same time, he denied that his team had made the article’s core, definitive claim, despite its framing of racial bias in polygraphy as an established fact. Instead, he mischaracterized their piece as merely “delivering a warning” about “the limitation in technology.” If that was the authors’ intent, they should articulate it more clearly in the journal.
Distract
Bradford introduced numerous tangential points that distract from the central issue: the article’s core claims are unsubstantiated. One of these tangents offered a correction to my explanation of electrodermal activity (EDA) that was itself incorrect:
"The polygraph community generally agrees that electrodermal activity (EDA, the inverse of SCR, aka galvanic skin response, GSR) is the easiest-to-measure channel..." - EDA is not the inverse of SCR. SCR is a component of EDA. Maybe you want to say something like "The polygraph community generally agrees that electrodermal activity (EDA, which includes both tonic and phasic components, such as skin conductance responses or SCRs) is the easiest-to-measure channel." If you make the change, please feel free to give me credit on your blog : ) [Feb. 12, 2025]
This confusion arises from the various ways EDA and its components are defined:
“Historically, EDA has also been known as skin conductance, galvanic skin response (GSR), electrodermal response (EDR), psychogalvanic reflex (PGR), skin conductance response (SCR), sympathetic skin response (SSR) and skin conductance level (SCL).”
Note that this definition includes conductance (SCR, SCL) and resistance (SRL, formerly GSR), despite the fact that electrical conductance and resistance are reciprocals (inverses). Under this definition, my description and Bradford’s were both incomplete.
Another recent definition similarly includes these inverses, conductance and resistance measures:
EDA is commonly assessed by evaluating skin conductance level (SCL) or skin resistance level (SRL). SCL refers to the measurement of the electric conductivity of the skin in response to cholinergically mediated excitation of the sweat glands (Dawson et al., 2017; McCorry, 2007). SRL is the inverse of SCL and measures the electric resistance of the skin in response to the excitation of sweat glands. Thus, high SRL is synonymous with low SCL.
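The reciprocal relationship between the conductance and resistance measures can be made concrete with a little arithmetic. A minimal sketch (the numeric values here are illustrative only, not drawn from any cited study):

```python
# Conductance (SCL, in siemens) and resistance (SRL, in ohms) are reciprocals:
# G = 1 / R. A high SRL reading describes the same physiological state as a low SCL.

def scl_from_srl(srl_ohms: float) -> float:
    """Convert a skin resistance level (ohms) to a skin conductance level (siemens)."""
    return 1.0 / srl_ohms

# Example: 100 kilohms of skin resistance is 10 microsiemens of conductance.
scl = scl_from_srl(100_000)   # 1e-05 siemens
print(scl * 1e6)              # 10.0 (microsiemens)
```

This is why definitions that list both SCL and SRL under the umbrella of EDA are not contradictory: the two scales carry the same information, just inverted.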
However, some definitions treat EDA and SCL strictly as conductance measures, summing phasic (rapid-changing) and tonic components like SCR — aligning with Bradford’s description.
So, I should have been more precise: EDA can be measured in terms of conductance, resistance, or both. My original phrasing reflected what I was told years ago by a professional polygrapher, but investigating Bradford’s objection led me to refine my understanding. I would welcome any additional clarifications.
EDA, GSR, SCR, SCL, SRL — these acronyms all refer to fingertip sweat measures. A little terminological confusion on both sides is understandable.
Degrade
Bradford’s condescension escalated to ad hominem attacks when I asked if I could post his entire response on the blog to respond publicly. He denied that permission (email, Feb. 13, 2025).
Whittaker, the article’s first author, did not respond to the blog post on X or by email, nor to being copied on Bradford’s correspondence.
Open science in process…
This situation highlights the importance of open science in advancing quality in research. It’s in process…
I should have already published my polygraph bias dissertation research in a scientific journal, but never got around to trying, so there is less in the record to help keep others from going astray like this. So I’m working on getting that data out there next in this series (bias bingo, bias bandwagon, now this).
While her review is based on unsubstantiated allegations of bias, credit to Whittaker for posting the preprint on OSF.
And, while Bradford’s response was flawed, engaging with constructive criticism shows more integrity than ignoring it altogether.
We all have room to improve. Hopefully, we all will. By upholding increasingly ascendant open science norms using increasingly accessible open science infrastructure, correcting the record when it’s wrong, and engaging transparently and constructively with critiques, the scientific community can advance collective understanding and research integrity.