We are in the midst of a public crisis of confidence [1] in academic expertise. Whether the issue is climate science, epidemiology or economics, members of the public are often distrustful of expert opinion. This is often depicted as primarily an issue on the right, but it occurs on the left as well (consider opposition to GMOs, the belief that RoundUp is meaningfully carcinogenic and a host of other issues). Solving this problem requires more than just telling people to `Believe Science’ because people are reacting to valid epistemic concerns about the trustworthiness of the information they receive from experts. At the same time, we face an internal crisis of credibility, with results in many scientific fields rendered unreliable by p-hacking [2].
Real Reasons For Distrust
The problem faced by members of the public is that it’s not enough to know that the experts aren’t lying — trust requires believing that they are fairly presenting the full context, including the best counterarguments that someone who understood the subject and shared the values of that media consumer would make. For instance, consider the fact that during the pandemic many epidemiologists went to the media to warn about the dangers of COVID transmission during various conservative protests but then stayed quiet about the issue during the BLM protests. Of course, one can perfectly reasonably believe that the social importance of the BLM protests justified the risk incurred but not the earlier conservative protests. However, this perfectly illustrates the problem — if your values differ from those held by most epidemiologists then respecting the warnings they give puts you at a disadvantage [3].
Or consider the issue of anthropogenic climate change. I have a high degree of confidence in the science on this because I’ve gone and looked at some of the scientific papers and seen that Professor Mueller’s skeptical re-examination of the data eventually reached the same conclusion, but most of the public doesn’t have those options [4]. How is most of the public supposed to distinguish the strong science on anthropogenic global warming from the vast array of environmental warnings which make vague appeals to the value of biodiversity and mention the environmental harms of some industrial activity, but don’t fairly weigh the degree of the harm or consider the potential benefits? Or what about the fact that, going only off the media reports and abstracts of major studies, you’d think the evidence strongly supports both bias against women in STEM academia and bias against blacks by the police. Yet, when you dig into the papers, you find that while the evidence showing worse treatment of blacks in traffic stops [5] is impressively robust (imo), the papers claiming bias in STEM academia often find the opposite in their primary analysis and report bias only in sub-group analyses (not to mention the various results, including adversarial collaborations, showing an advantage for similarly situated women [6]).
An Oath For Academics
This problem is roughly analogous to that faced by doctors. People need to be able to trust that their doctor isn’t manipulating what information they share for their own benefit, e.g., failing to mention other potential therapies which might not be as remunerative for the doctor, or slanting their advice to benefit other patients. While not perfect, part of the way doctors reassure patients is by taking an oath and having norms that prevent them from, say, only mentioning the treatment option which preserves a patient’s organs for donation to others.
Therefore, I propose that academics take their own oath. I’d love suggestions on the exact wording, but it should include the following aspects:
A duty to fully and fairly represent the totality of their expertise — including any uncertainty — without regard to its political impact.
Concern that certain findings might be misused or distorted to justify undesirable beliefs is never a reason not to share them — but it may justify contextualizing them, adding caveats, etc.
An obligation to always distinguish whether they are speaking in their role as an unbiased expert or in their personal capacity as a concerned individual.
A duty to try to correct public misunderstandings and misrepresentations of their subject without favor or bias — whether or not they agree with the conclusion the error is used to justify.
A commitment to the truth in all things — especially in publication — swearing to publish only in ways that increase understanding and never to engage in p-hacking or other practices which render their publications misleading.
A duty never to hide results because they are unfavorable or don’t demonstrate the effect they wished to find. Even if the results can’t be published in a journal, they should be documented publicly — along with any reasons to believe they may not be trustworthy.
A duty to fairly present the best evidence against any view advanced whenever there are any reasonable grounds for doubt.
—
1: It feels like this is a recent phenomenon, but I don’t have the evidence to support that claim. I’ll only say that there are large swaths of the population who don’t feel they can rely on what experts tell them about what’s true in their field, to the detriment of public policy.
2: Or, more generally, what Andrew Gelman calls the garden of forking paths, which covers unintentional as well as intentional analytic choices that yield published results which don’t necessarily bring our understanding closer to the truth.
3: Imagine the two possible worlds in which only one of those protests occurs. In the world with only the conservative protest, conservatives who listened to those experts stay home; in the world with only the BLM protest, the protestors turn out anyway.
4: Also, I benefit from having a personal connection here — my wife is good friends with one of the Professor’s daughters — and based on my interactions I’m quite confident that Professor Mueller would have loved to have reached the opposite conclusion if that’s what the data supported.
5: It’s important to note that this bias is strongly supported in the lowest stakes interactions (traffic stops), but there isn’t strong evidence of bias in the use of deadly force. I’m unsure whether the latter finding incorporates the effects of earlier treatment, e.g., even if police are no more likely to use force against black than white suspects engaged in the same behavior, situations may escalate more frequently for black individuals because of those earlier differences in treatment.
6: Of course, one can still argue that even if hiring, publication and conference practices don’t show any gender bias, women are pushed out of the pipeline at earlier stages because of some kind of bias. However, that’s a different claim and it’s unclear if it’s bad in the same way. It probably depends on how it happens, and it suggests very different policy responses, e.g., programs to favor hiring women for STEM professorships only unfairly reward the women lucky enough not to have experienced effective discouragement (or encouragement in another area) and do nothing to help those women who dropped out early.
When I was in gradual school, I heard Prof Wolfgang Stolper explain that there are sins and there are professional sins. It is a sin for either a bishop or an academic to lie or to seduce a virgin, but it is a professional sin for a bishop to seduce a virgin and a professional sin for an academic to lie.
Have you read Timur Kuran's *Private Truths, Public Lies: The Social Consequences of Preference Falsification*? I fear that Kuran may have hit that unhappy spot where he has produced a text which is too turgid for the general reader while lacking the rigor that the academic reader wants, but economists are an odd sort of academic, and it is hard to know what they will like next.
(And Professor of Economics and Islamic studies is an unusual combination, which made me hopeful of reading a rather different perspective on the problem, rather than the same *blah blah misinformation studies* you can read anywhere these days. Yep, it's different, but alas not all that falsifiable, as is all history.)
I think there is a great deal of explicit preference falsification going on in academia, and it would be nice if any proposed Hippocratic-type oath targeted it explicitly.