Zeynep Pamuk on the Case for Creating Science Courts
A political scientist discusses the highly charged interplay between the worlds of science and politics.
SCIENCE AND POLITICS intersect on many levels. Many scientists depend on public funding to conduct their research — an inherently political process — and political leaders depend on scientists to inform their policy decisions. Moreover, the ethical ramifications of scientific research bear directly on ordinary citizens, who depend on governments to determine which lines of scientific inquiry are supported.
But Zeynep Pamuk, a political scientist at the University of California, San Diego, feels the interplay between these two worlds — science and politics — has only begun to be properly explored. Pamuk’s interest in this relationship began early in her career, when she started to examine the discourse surrounding climate change. “I realized that there was great scholarship on climate change, but it didn’t get a lot of uptake,” Pamuk told Undark. “So I became interested in why that was the case. What is it about the intersection of science and politics that’s become so pathological?” She eventually saw that “there wasn’t as much scholarship on that question, especially from within political science.”
In her new book, “Politics and Expertise: How to Use Science in a Democratic Society,” Pamuk outlines new directions that she believes the relationship between science and politics might take, rooted in the understanding that scientific knowledge is tentative and uncertain. Among her proposals is the resurrection of the science court, an idea first put forward in the 1960s.
The interview was conducted over Zoom and has been edited for length and clarity.
Undark: Much has been written on the importance of scientific literacy, and, especially in the last few years, on the problem of science denial and on the trust, or lack thereof, in science and scientists. But you frame your investigation very differently. What was your starting point?
Zeynep Pamuk: There’s a lot of discussion about denial of science, why citizens are so ignorant, why they don’t understand science. And I wanted to change the conversation by examining how the way science is done, how scientific research is conducted, and how the choices that scientists and science administrators make, at far earlier stages of the research process, shape the uptake and framing of the debate. So I think the contours of the debate were too narrow.
UD: In your book, you talk about the idea of scientists taking responsibility for their research. That’s an idea with a long history — one thinks of the atomic bomb, for example, and genetic engineering. How do you see this issue of responsibility for scientists?
ZP: I’m interested in the question from the perspective of how a democratic society deals with the presence within it of this knowledge-producing but fairly autonomous community of scientists. So when I say that scientists need to take responsibility, I don’t mean it in the way that a lot of people said about the atomic scientists — that they could be held morally responsible.
Sure, I don’t disagree with that. But I was more interested in what society could do to regulate these kinds of high-risk scientific endeavors. And I didn’t think that the answer that scientists have to be morally responsible, to examine themselves and restrain themselves — the idea that they self-monitor, that they can be trusted to do that — was a sufficient answer.
UD: Are you saying that science requires more regulation or oversight?
ZP: In certain kinds of very high-risk scientific research, these decisions should be made collectively, or at least by authorized political representatives, and there should be more public debate around them. The Obama administration at one point put a moratorium on lethal pathogen research. There was some coverage, but not a huge amount of discussion; then it reversed its decision three years later. It’s very difficult to find any paper trail about what happened. What was the discussion? What was the reasoning? Did they decide it was now safe?
It’s very hard to know what happened. And it seems like this is hugely consequential on a global, planetary level. So there has to be more discussion around it. This kind of risk decision should not be left purely to scientists. We can assign them responsibility, but that doesn’t mean they alone should be responsible for making this very consequential decision.
UD: Should governments be able to tell scientists that certain lines of inquiry are off-limits?
ZP: I think the answer is yes. I’m not going to say this area should be restricted or that area — I think this is a collective decision. My opinions are my personal opinions as a citizen of a democratic society. But I think more debate is appropriate. And in certain cases, there might be a lot of support for undertaking risky research, because people imagine that it will bring a better world — but in other cases, there are no conceivable benefits. I’m thinking maybe of killer robots, as one example. Or maybe that the benefits don’t justify the risks. So it’s something that would come out of debate. But I think there can certainly be areas where limits should be placed on research.
UD: One very interesting idea in your book is the notion of a science court. What exactly is a science court? How would it work, and what would its purpose be?
ZP: I stumbled upon this idea as I was looking at debates around science in the 1970s. This was a period when there was a lot of debate, because scientists were very influential; the glow of the World War II victory was around them. They had direct influence over politics. But of course, they disagreed among themselves. And a scientist called Arthur Kantrowitz suggested a science court, basically to adjudicate between disagreeing scientists, so that the public confusion this caused would come to a stop.
But he had a strict division of facts and values: This would be the factual stage, and then the values would be discussed later. And for the reasons I just mentioned, I didn’t think that made sense. You can’t debate the science independently from the policy context or the context of use. And also, I thought this was a fairly elitist institution, with only scientists participating.
UD: But you feel there was something of value in Kantrowitz’s idea?
ZP: I wanted to reimagine it. I took his structure, with different, disagreeing scientists making a case for their own views; but I wanted to have citizens there, and I wanted it to be a more overtly policy-oriented institution. So the way I imagine it, there would be a scientifically informed policy debate: for example, should we have strict lockdowns, or a less strict Covid-19 policy?
So it would have two clear sides — and then scientists for both sides would defend their views. They would ask each other questions that would help reveal the uncertainty of their views, the evidence that they’re marshalling. And then the citizen jury would be randomly selected. They would bring their own political beliefs, they would listen to the scientists, and they would make a policy proposal, selecting one of the two positions.
UD: But scientists and politicians already argue a great deal. How would a science court be an improvement on the current system, in which there’s already a lot of debate?
ZP: It’s true that scientists constantly argue among themselves, but I’m not sure the scientists have unmediated arguments in front of a public audience. I think that is discouraged within current advisory systems. Maybe the climate experience led to this. But even before that, in the ’70s and ’80s, there was this norm that scientists argue behind closed doors within scientific advisory committees, but then they present a united front when they give advice.
So there’s one authoritative scientific advisory body, and that basically gives a consensus recommendation. So publicly oriented scientific disagreement is seen as something that undermines trust in science — that emphasizing the uncertainty will mean anything goes, that scientists don’t know anything. And I wanted to push back against that. I thought a properly organized institution, where scientists are facing one another directly, not necessarily mediated by politicians who have their own agenda and who just want to cherry-pick the science that serves it — that could have healthy effects for clarifying the factual basis of this political decision making for the citizenry.
UD: When we think of scientists struggling to present a united front on a topic of great public interest, the current coronavirus pandemic certainly comes to mind. But you argue that a lot of those disagreements were hidden from view?
ZP: We saw this during the Covid-19 pandemic, with the masking advice in the U.S. It was initially presented as, “This is our position: masks do not help; do not wear them.” Fauci said this, the Surgeon General said this, [former White House adviser] Deborah Birx said this — they were unanimous in this. And we did not hear from anybody within the scientific community.
And of course, debates were happening within the scientific community about the evidence for the benefits of masks, but we did not hear the opposing side: people saying ‘Oh, masks are probably very effective,’ or at least, ‘We don’t know that masks are effective, and this is our level of uncertainty.’ We didn’t hear the opposing view at all.
And I think that hurt the case, because it made the reversal very difficult; it made people not trust the masking advisory when it came in, in April 2020. So that was a good example of the kind of thing where a science court would have helped.
UD: But on the other hand, if the public had a greater window onto scientific arguments as they unfolded, maybe they just wouldn’t listen to scientists at all. As you suggested, they might think, “Oh, look — they can’t even agree among themselves.”
ZP: Yeah, I think that’s true. That’s the risk: if people see disagreement, they might think scientists can’t agree — and that usually is the case. But the one thing I will say is that when you see scientists disagreeing, you also see the scope of disagreement. For example, you don’t see scientists saying “vaccines are ineffective,” or “vaccines are hugely dangerous.” So you see what sorts of things they’re disagreeing on, and that gives you a sense of where the debate is at.
If you overstate what scientists know, where the consensus lies, then there is a chance — and this happens all the time — that it will turn out to be wrong. And I think that undermines public trust even more than a candid admission that, at this point in time, scientists are disagreeing on a certain point.
UD: But, wouldn’t having ordinary citizens act as arbiters in scientific disagreements bring us back to the issue of scientific literacy? For example, if some members of the public don’t understand the difference between a virus and bacteria, then they’re in a very poor position to evaluate strategies for fighting infectious disease — right?
ZP: Yes, I agree with that completely. I think improvements in scientific literacy would be critical for an institution like this to succeed. Then the question is, how much literacy? I think we can have a citizenry that is more literate about the scientific method, about the difference between viruses and bacteria. But that still wouldn’t mean that they’d become experts, or that they would need to have a Ph.D. to participate in the science court.
By Dan Falk
How to Use Science in a Democracy
In the classic work Democracy and its Critics, Robert Dahl said Plato made the most compelling case against democracy.
Science in a Democracy
Most of us recall that Plato imagined a republic where a philosopher king ruled over an orderly utopia. Today it’s difficult to take the idea of a philosopher king seriously. Many conservatives even disdain academics and philosophers. As William F. Buckley famously said, “I would rather be governed by the first 2,000 people in the telephone directory than by the Harvard University faculty.”
Nonetheless, the idea of a philosopher king raises many challenges even for those who view it as desirable. Perhaps the most obvious difficulty is selection: how would such a ruler be chosen? Of course, Dahl did not seriously consider a philosopher king attractive for a modern society. Rather, he saw the rule of experts or guardians as the more realistic alternative. Indeed, many political thinkers have considered various incarnations of technocratic governance. Even democratic governments have incorporated features of technocracy through a trained bureaucracy, central banking, and a professional military.
Regardless, the relationship between expertise and democracy is always tense. At no time has this been more true than during the pandemic. Many communities have rebelled against mandates from public health professionals, such as mask requirements, school closures, and shutdowns. Zeynep Pamuk admits, “The partnership between democracy and expertise is intrinsically unstable.” Of course, it’s easy to dismiss public dissatisfaction with public health officials as populism rather than democracy. Indeed, some of the loudest critics of public health officials defend some of the most egregious assaults on democracy, such as the January 6th riot.
A recent book from Zeynep Pamuk, Politics and Expertise, explores the role of science in a democracy. It’s among the most insightful efforts to explain how democracy can become participatory, deliberative, and epistemic. Let me explain…
Traditional theorists believed democratic governance required common experiences and ideas. Homogeneity made democracy possible. Thomas Jefferson, for example, imagined a republic of citizen farmers partly because they embodied the independence necessary for self-governance, but also because specialization was considered antithetical to democracy. Democracy demands generalized knowledge from its citizens so they can participate in a wide variety of political questions. The division of labor in society, by contrast, inevitably implies specialization in governance itself: it hands the realm of government over to specialists, who become de facto elites.
Theories of epistemic democracy, on the other hand, embrace diversity in experience and knowledge. They view specialized knowledge as a contribution to the community, and they encourage deliberation so that different perspectives are shared and the community can make the best decisions possible. In other words, epistemic democracy argues for a heterogeneous or cosmopolitan form of democracy where different perspectives strengthen the collective decision-making process. Political theorist Hélène Landemore writes, “It is inclusive deliberation of all on equal terms followed by inclusive voting on equal terms that offers us the safest epistemic bet in the face of political uncertainty.”
At the same time, epistemic democracy does not merely defer to experts; it demands participation from citizens with different perspectives to provide a more complete understanding of problems. It’s not enough to simply follow the science. Doing so would only produce a single homogeneous view of knowledge where opinions converge through education. Zeynep Pamuk explains, “While the public understanding of science is clearly important, starting from this question presupposes that the appropriate role of nonexperts has already been settled, and the primary goal is to inform and educate them about science.” Epistemic democracy views knowledge itself as heterogeneous. This distinction opens space for nonexperts to make meaningful contributions based on their own experiences and perspectives.
Pamuk places heavy emphasis on the distinct roles of experts, citizens, and political leaders. She argues citizens do not need to ignore science to make meaningful contributions in a democracy. Political decisions involve questions of values and priorities, and they often involve tradeoffs between competing goals. Scientific advice may point to ways to achieve collective goals, but it does not resolve important questions of values or priorities. Moreover, Pamuk recognizes, “Expecting scientists to discern and use social and political values in their research would be to assign scientists a duty of political representation. This is a role for which they are neither qualified nor properly authorized.”
During the pandemic public health professionals offered important guidance to bring down the rate of infections, reduce hospitalizations, and avoid unnecessary deaths. Many politicians followed their advice without any debate. Many justified their policies as following the science. Others attacked experts as elitist. However, some raised valid questions about school closures and lockdowns based on other priorities or values. Please note I am not making a case for or against these policies. Rather my point is democratic deliberation does not end with a scientific recommendation.
Political polarization in the United States has turned into a contest between those who support science and those who ignore it. Neither approach is healthy for democracy. Pamuk makes the case that science will receive more respect when citizens understand both its value and its limitations. Moreover, science rarely offers the certainty policymakers and the public want it to deliver. Rather than simply understanding science better, it’s more important for the public to understand degrees of scientific certainty and areas of dissent. For example, scientists recognize the reality of global climate change but disagree about the best ways to approach it. These disagreements do not challenge the existence of global warming; they often reflect differences in policy preferences or even values. Many scientists, for instance, advocate for devoting greater resources to adaptation rather than mitigation.
The decision citizens must make is not whether to support or abandon science. Zeynep Pamuk shows science is often a starting point for further deliberation in a democracy. Critics of science miss this important point. Too often they refuse to listen to the advice of experts precisely because they are experts. This is not only misguided, but also dangerous for good governance. Pamuk makes a much more reasonable request, “Even if we grant pessimism about politicians, we should resist idealizing science and scientists at the same time.”
Zeynep Pamuk joins the podcast to discuss her book, Politics and Expertise: How to Use Science in a Democratic Society. Look for it tomorrow or subscribe to the Democracy Paradox on your favorite podcast app today.
Ivan Cerovac (2020) Epistemic Democracy and Political Legitimacy
Simone Chambers (2018) “The Philosophic Origins of Deliberative Ideals,” in The Oxford Handbook of Deliberative Democracy
Robert Dahl (1989) Democracy and Its Critics
Carolyn Hendricks, Selen Ercan, and John Boswell (2020) Mending Democracy: Democratic Repair in Disconnected Times
Hélène Landemore (2020) Open Democracy: Reinventing Popular Rule for the Twenty-First Century
Zeynep Pamuk (2021) Politics and Expertise: How to Use Science in a Democratic Society
Steven Pinker (2018) Enlightenment Now: The Case for Reason, Science, Humanism, and Progress
Plato (4th Century BCE) The Republic
Susan Rose-Ackerman (2021) Democracy and Executive Power: Policymaking Accountability in the US, the UK, Germany, and France
Hans Rosling with Ola Rosling and Anna Rosling Rönnlund (2018) Factfulness: Ten Reasons We’re Wrong About the World–and Why Things Are Better Than You Think