Ideological Mandates in Publishing: A Comment on Nature Human Behaviour’s Guidelines for Publication
by Medhini Urs
Note: My name is Medhini Urs and I am a Cognitive Science PhD student. I started writing this essay a week after the Nature Human Behaviour editorial came out a couple of months ago. This semester (Fall 2022) I am enrolled in a seminar course as a requirement for a program on the role of bias in AI and social science. When given the opportunity to present on a topic related to bias, I decided to share with my fellow PhD students the ongoing thoughts expressed in this essay on ideological biases, censorship, and their consequences in science. The Nature Human Behaviour editorial was, sadly, a very good example for elaborating my thoughts on how ideological mandates are antithetical to scientific inquiry. After the seminar, I reworked the essay to reflect the order and content of what I presented to my peers. I am very glad to have had the chance to share my thoughts with them on this topic and wanted to share the essay with a wider audience. I hope you enjoy reading it, and feel free to reach me at medhiniurs@gmail.com if you have any questions or want to chat further.
The title of the editorial “Science must respect the dignity and rights of all humans” is hardly disagreeable. However, it does a poor job of describing the editorial’s substance. The pleasant title obscures the fact that the journal has sadly adopted the position that ideological views of potential harms and negative outcomes may be allowed to dictate what can be published, modified post-publication, or even retracted.
I believe the true essence of the editorial was evident in a Twitter thread in which the journal’s editor, Stavroula Kousta, articulated that she disagrees with the notion that science should be evaluated only on scientific soundness and merit.
There are many excerpts in the Nature Human Behaviour editorial that illustrate how the journal has unfortunately made external goals and outcomes of “preventing harm” conditions for publication and retraction.
THE EDITORIAL’S “MAIN IDEA”:
“New ethics guidance addresses potential harms for human population groups who do not participate in research but may be harmed by its publication.”
AUTHORS’ ATTEMPT AT DEFINING HARMS FROM PUBLICATION:
“Yet, people can be harmed indirectly. For example, research may — inadvertently — stigmatize individuals or human groups. It may be discriminatory, racist, sexist, ableist or homophobic. It may provide justification for undermining the human rights of specific groups, simply because of their social characteristics.”
“Harms can also arise indirectly, as a result of the publication of a research project or a piece of scholarly communication – for instance, stigmatization of a vulnerable human group or potential use of the results of research for unintended purposes (e.g., public policies that undermine human rights or misuse of information to threaten public health).”
HOW TO PREVENT HARMS ACCORDING TO THE EDITORIAL:
1) Amendments and retractions
“Advancing knowledge and understanding is a fundamental public good. In some cases, however, potential harms to the populations studied may outweigh the benefit of publication.”
“Editors consider harms that might result from the publication of a piece of scholarly communication, may seek external guidance on such potential risks of harm as part of the editorial process, and in cases of substantial risk of harm that outweighs any potential benefits, may decline publication (or correct, retract, remove or otherwise amend already published content).”
Privileged Perspectives: an example of a harm that can be subject to amendment or retraction
“[Academic content that]...promotes privileged, exclusionary perspectives raises ethics concerns that may require revisions or supersede the value of publication.”
“[editors reserve the right to request modifications to (or correct or otherwise amend post-publication), and in severe cases refuse publication of (or retract post-publication)]...submissions that embody singular, privileged perspectives, which are exclusionary of a diversity of voices in relation to socially constructed or socially relevant human groupings, and which purport such perspectives to be generalisable and/or assumed.”
2) Consulting those with political agendas to decide how things are stated and what should be published:
“We commit to using this guidance cautiously and judiciously, consulting with ethics experts and advocacy groups where needed.”
The various excerpts from the editorial underscore the need to highlight David Hume’s argument in his 1739 work, A Treatise of Human Nature. Hume put forth a strong argument that facts or empirical observations about the world (positive statements) are separate from what the world should be (normative statements). Just because one knows what “is” does not mean that one knows what “ought to be”. This distinction, known as Hume’s Law (and closely related to the naturalistic fallacy), is important for scientists, moral philosophers, and especially those who try to be both. Another important principle for scientists is that what “ought to be” should not obfuscate what “is”. In other words, we should not allow ideology to dictate which scientific findings can be published.
Ideological mandates (such as the ones echoed in Nature Human Behaviour’s new guidelines) lie directly in opposition to scientific inquiry, integrity, and progress. Why? Because if history is a guide, when ideology has captured the scientific enterprise, knowledge is selectively championed or censored to serve an end goal. Empirical evidence becomes subordinate to the “good intentions” of the ideology.
For example, in the Soviet Union, scientists were elevated or diminished according to how beneficial their findings were to the noble causes of the Marxist-Communist ideological framework. The cases of Ivan Pavlov and Trofim Lysenko demonstrate this clearly.
Ivan Pavlov was not a fan of the Bolshevik Revolution or the Communist Party, and yet his work was still very much respected by the Soviet leaders. Perhaps it was because Pavlov’s findings on classical conditioning and stress responses were potentially helpful to the Soviet state’s agenda. According to Joel Dimsdale and Edward Hunter, Lenin realized that Pavlov’s discoveries could help him shape the minds of Soviet citizens. This was a beneficial end goal for Lenin and the Party, so funding for Pavlov’s lab was increased and he was allowed to experiment on psychiatric patients. The fickle nature of ideological censorship can be seen in the irony that Pavlov was celebrated for his discoveries in classical conditioning under one regime, while Ivan Sechenov, the scientist who influenced him, had been repudiated by the previous Czarist regime for findings that threatened Christianity’s view of free will.
The leaders of the Soviet Union were also fans of Trofim Lysenko’s Lamarckian approaches to agriculture and suppressed dissenting viewpoints. Lysenko’s ideas supported the Marxist principle that the environment is the one true determinant of outcomes and that any inherent advantage or disadvantage can be erased through the right environmental conditions. Lysenko even rejected evidence of chromosomal inheritance and called the concept of heredity the product of “Western imperialist oppressors” and “Bourgeois scientists”. As the director of the Institute of Genetics, Lysenko controlled the scientific establishment and dictated what was allowed to be published or discussed. His policies led to the imprisonment and death of scientists who did not share his views or who identified the flaws in his methods. Millions of people died in famines resulting from Lysenko’s unscientific but ideologically convenient policies. He was not working in service of discovering the truth using scientific principles but in service of a worldview of how things should be.
We all have biases with regard to how the world should be and how we should play our part in shaping it. Some of these biases may be shaped by our political, religious, or cultural worldviews. As explained in a brilliant essay by Stuart Ritchie, scientists, regardless of their biases, must try to be as objective as possible. To approach this ideal, we must have robust data collection and sampling procedures, use reliable and valid methodologies, and be precise when reporting findings by providing context. Rigorous studies allow us to have productive discussions about their strengths and limitations, paving the way for progress in novel and promising avenues.
The intended purpose of peer review in scientific journals is to weed out egregious design and methodological flaws that lead to faulty conclusions. Peer review is hardly a perfect system, and papers containing fraudulent data or invalid methodologies do slip through. In those cases, retraction is warranted and necessary. However, what scientific journals - the current middlemen and gatekeepers of scientific knowledge - should not do is engage in noble lies. My main questions to the editorial board of Nature Human Behaviour are:
What if a study is methodologically sound but is inconvenient to certain outcomes?
What if the findings generate large outcries on social media?
What if there are petitions to have a paper retracted on the grounds that it offends individuals or might embolden certain factions of society to harm others?
It is a misguided effort and beyond the scope of scientific publishing to try to control the harms of potential policies or how everyday people interact with one another. As Judea Pearl tweeted, it is not easy to determine “harm” and “benefit” even in simple hypothetical scenarios, much less when that sort of calculus is performed on complex real-world scenarios. Additionally, the vagueness of the terms in the new policies and the lack of precise implementation protocols seem to make the calculus much harder and more prone to error.
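To see why even the hypothetical version of this calculus is difficult, consider how the probabilities-of-causation literature (in the spirit of Pearl’s work) formalizes individual-level benefit and harm as counterfactual quantities. The Python sketch below is my own illustration, using made-up outcome rates and the standard Fréchet-style bounds; nothing in it comes from the editorial or from Pearl’s tweet:

```python
def benefit_harm_bounds(p_outcome_treated, p_outcome_control):
    """Fréchet-style bounds on P(benefit) and P(harm) from experimental outcome rates.

    P(benefit): probability an individual has a good outcome with treatment AND a bad
    outcome without it; P(harm): the reverse. Neither quantity is identified by the
    two rates alone -- they are only bounded.
    """
    p_t, p_c = p_outcome_treated, p_outcome_control
    benefit_bounds = (max(0.0, p_t - p_c), min(p_t, 1.0 - p_c))
    harm_bounds = (max(0.0, p_c - p_t), min(1.0 - p_t, p_c))
    return benefit_bounds, harm_bounds


# Hypothetical numbers: 60% good outcomes under treatment, 50% under control.
benefit, harm = benefit_harm_bounds(0.60, 0.50)
print(f"P(benefit) lies somewhere in [{benefit[0]:.2f}, {benefit[1]:.2f}]")
print(f"P(harm) lies somewhere in    [{harm[0]:.2f}, {harm[1]:.2f}]")
# -> P(benefit) in [0.10, 0.50], P(harm) in [0.00, 0.40]: wide ranges, even in an
#    idealized randomized experiment with none of the real-world complications.
```

If even an idealized experiment only brackets these quantities, it is hard to see how editors could reliably weigh a “substantial risk of harm” against “potential benefits” in complex real-world publication decisions.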
The fact that the editorial explicitly mentions involving advocacy groups in the publication process is alarming. Why should advocacy groups, which have their own agendas, play a role in whether empirical findings can be stated or published? A list of groups is shown below. Would it improve trust in scientific institutions if people knew that research was vetted by any of these groups beforehand? Bo Winegard, in his perfectly cutting Quillette article, makes an excellent point by asking how it would look if pro-life advocates were consulted before the publication of an article relating to abortion and well-being. It is not clear which advocacy groups will be consulted or under what circumstances their consultation would be necessary. In my opinion, the involvement of advocacy groups in the editorial process constitutes a conflict of interest (something that is generally frowned upon). We should remember that what certain ideologies dictate the world “ought to be” should not supersede what “is.”
Scientists and editors are human and have ideological bents and biases. If they do not like the downstream consequences of scientific work relating to policies, social interactions, or global trends, they are free to engage in discussions and other democratic processes. Whether their aim is to help shape public policy or to help clarify the findings, they should be able to talk, vote, and advocate for change. But they should do this sort of outreach work outside the realm of scientific publications. Be the activist you want to be, but please stick to good methods and empiricism when it comes to doing and reporting on science. Don’t ideologically screen out studies during the publication process.
We must remember that science is an ever-evolving set of ideas obtained through systematic approaches to understanding phenomena. These ideas should be open to criticism and debate supported by empirical evidence. Evidence can challenge and sometimes topple dominant scientific constructs or notions. Whether for the scientists who proposed a dominant theory or for those whose social, political, or economic beliefs were supported by that theory, evidence can be very inconvenient to those with vested interests. But refusing to publish scientific studies, or retracting them, in the name of socially, politically, or economically “good intentions” is unjustifiable.
We are not living under a totalitarian regime of suppression like the Soviet Union’s, one that threw scientists in prison or sentenced them to death for challenging the state’s worldview. However, as Lee Jussim has written, there are nonetheless many cases where ideological suppression trumps scientific evidence and discourse, and not just in publishing. Anna Krylov has described many parallels between her experience growing up in the Soviet Union and current-day academic practices in the West. She writes, “I witness ever-increasing attempts to subject science and education to ideological control and censorship. Just as in Soviet times, the censorship is being justified by the greater good.”
As previously mentioned, scientific journals are the middlemen and powerful gatekeepers who play an important role in the dissemination of scientific knowledge. In the behavioural sciences, in addition to the replication crisis, we have a publication bias known as the file drawer problem. Well-conducted, methodologically sound studies with null results face difficulty being published, which affects meta-analyses, systematic reviews, and the overall state of knowledge (both present and future). Let us not add an additional bias to the existing problem by letting vague ideological notions determine what can be published or remain published.
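To make the cost of that extra filter concrete, here is a toy simulation of the file drawer problem. It is my own illustration: the zero true effect, the sample sizes, and the 5% chance that a null result still gets published are assumptions chosen for the example, not estimates from any real literature.

```python
# Toy simulation of the file drawer problem: if journals favour positive,
# statistically significant results, the published literature overstates the
# true effect even though every individual study is conducted soundly.
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.0          # assume the effect under study is actually zero
N_PER_GROUP = 30           # participants per group in each study
N_STUDIES = 2000           # number of studies conducted
PUBLISH_NULL_PROB = 0.05   # chance a non-significant study is published anyway

def run_study():
    """Simulate one two-group study; return (observed effect, significant-positive flag)."""
    treatment = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N_PER_GROUP)]
    control = [random.gauss(0.0, 1.0) for _ in range(N_PER_GROUP)]
    diff = statistics.mean(treatment) - statistics.mean(control)
    se = (2 / N_PER_GROUP) ** 0.5      # rough standard error of the mean difference
    return diff, diff > 1.96 * se      # crude "significant positive result" criterion

all_effects, published_effects = [], []
for _ in range(N_STUDIES):
    effect, significant = run_study()
    all_effects.append(effect)
    if significant or random.random() < PUBLISH_NULL_PROB:
        published_effects.append(effect)   # the file drawer swallows the rest

print(f"True effect:                    {TRUE_EFFECT:+.3f}")
print(f"Mean effect, all studies:       {statistics.mean(all_effects):+.3f}")
print(f"Mean effect, published studies: {statistics.mean(published_effects):+.3f}")
```

Even when every individual study is honest, the published subset tells a systematically different story from the evidence as a whole; layering ideological screening on top of this selection can only widen that gap.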