Science is a powerful method of gathering information. But as you have seen throughout this course, information is only one input into the policymaking process. When choosing between multiple policy options, the facts themselves may not suggest which is better. Rarely can the costs and benefits of a policy be fully separated from moral and cultural judgments. And while scientists may strive to be objective, societal influences on research can result in scientific evidence and advice that does not serve all communities equally, which affects how policy is formulated. In this module, we will explore these limits on science in policymaking in a democratic society, where technical expertise must also work with popular opinion, as represented by elected officials and public participation.

In conversations about how science interacts with broader social systems, it’s important to recognize that people may refer to different, though related, concepts when using the word “science.” Science is often defined as a system of creating knowledge, characterized by approaches operating under a scientific method. (The scientific method is itself difficult to define adequately, but that is outside the scope of this module.) But people may also use “science” to describe the collection of facts and observations produced by this system. Both usages are important to consider. Under the first definition, science as a human practice will always be subject, intentionally or unintentionally, to broader social influences. Under the second, it is important to consider which facts and observations have been made, which have not, and how groups may influence or act on the available information.

Ethics and Trust

The scientific enterprise is built on a foundation of trust. Society trusts that scientific research results are an honest and accurate reflection of a researcher’s work. Researchers equally trust that their colleagues have gathered data carefully, used appropriate analytic and statistical techniques, reported their results accurately, and treated the work of other researchers with respect. When this trust is misplaced and the professional standards of science are violated, researchers are not just personally affronted; they feel that the foundation of their profession has been undermined. Such violations also damage the relationship between science and society.

For people to trust scientific results, they have to trust scientists and how they work. And for the government to support science through the various mechanisms discussed in other modules, the public needs to trust that scientists work in the public interest. Fortunately, scientists are among the most trusted groups of people in the United States. This makes it especially important to protect that trust by acting ethically. You can consider this module an extension of the scientific ethics you learned for research in your discipline to the responsible conduct of science policy.

Public Input and Impacts

One of the clearest ways public thought affects science is through governmental action. For instance, since 2016, Congress has essentially banned genetic engineering of human embryos for research in the United States due to concerns over the risks of the technology. Public influence also takes less extreme forms than prohibiting technologies. Many parts of modern research practice, such as institutional review boards for human studies, animal care and use committees, the formalization of broader impacts, and conflict of interest disclosures, developed in response to public concern about unethical behavior and the value of publicly supported research.

Stakeholder engagement, reaching out to people and institutions who will be impacted by or otherwise have an interest in policies, is a major input into the policymaking process. This applies to both science for policy and policy for science. In science for policy, stakeholder needs can determine which policy options are practical, and some stakeholders will need to accept the justification for costly policies. In policy for science, stakeholders typically want to ensure funding is used responsibly and may request certain deliverables from specific programs. The Consortium for Science, Policy, & Outcomes has developed a handbook on Usable Science, with advice on evaluating the scientific demands of policymakers and effectively structuring research to meet that demand.

There is also a trend of increasing engagement with the general public about emerging technologies, done to understand societal concerns that scientists should address in their research. Such public engagement was a major part of early government support of nanotechnology and is also seen in forums on biotechnology.

Broader engagement is especially important to consider when science policy impacts marginalized groups, such as people of color, LGBTQ people, and people with disabilities. If you engage only with government officials or influential organizations, you may not understand the concerns of groups who wield little or no formal power in a community, or how your policy impacts them. For instance, pollution tends to be concentrated in areas where people of color live and work (1, 2, 3). This is attributed to environmental racism, the effect of racial discrimination in environmental policymaking.

“Nothing about us without us” is a slogan popularized by disability rights advocates in the late 20th century and is now used in many advocacy movements. It means that groups should be included in decisions that will impact them. In science policy, this applies to both policymaking and research. In cases of environmental racism, communities of color may be unrepresented or underrepresented in the government institutions that regulate pollution, limiting their power in the legal process; getting their input may require additional forms of engagement. In research, scientists who aren’t members of a group they are studying may not adequately understand the group’s needs and desires. For instance, Deaf scholars have criticized hearing scientists and engineers for making “sign language gloves” that don’t work with how sign language is actually used and pose a burden to Deaf users.

Science and Values

Facts alone can rarely make a decision. Consider the simple circumstance of buying a light bulb. You may compare several bulbs of appropriate brightness, but you still have to weigh considerations of energy usage, color temperature, and disposability. For instance, many people disliked early compact fluorescent bulbs in comparison to older incandescent lights because of the aesthetics of their color.

These considerations become more complicated in significant choices that affect society. Policy evaluation involves weighing the effects of different proposals, and many policies involve trade-offs between things a community values. For instance, there are many ways a state or country could work to reduce its carbon emissions, but some proposals could produce undesired effects without careful planning. A mass increase in the use of electric vehicles could replace gasoline for transport, reducing carbon emissions, but mining the lithium for batteries can be environmentally damaging. A community may find that trade-off acceptable for various reasons. Alternatively, the state or country could decide that most residents owning and using personal vehicles doesn’t make sense given the resources it requires, and instead expand more efficient forms of public transit. This could require changing how cities are planned and altering people’s lifestyles. What is and isn’t an acceptable or desirable side effect of policies working toward the same goal will be based on what the community values.

Scientific results alone do not suggest what should be valued, or even how such values are constructed or prioritized (a value system). But the results are useful for people to understand and can help them see whether a policy aligns with their values. If a study finds that a policy fails to meet the goals of its supporters, their values may lead them to adjust it to work better or to mitigate unintended consequences. Sited appropriately, a lithium mine may avoid harming people or impacting important wildlife and ecosystems; appropriate regulation and enforcement can also reduce these impacts. People may also reconsider what they value in light of new evidence. A public transit education campaign may encourage people who initially prefer driving to use transit more if they realize it helps meet environmental goals they also value.

It’s also important to realize that the values scientists adopt in their own work are not necessarily universal and can conflict with other values. As discussed in the ethics and public input sections, governments and organizations also impose limits on scientific research to prevent immoral acts from being committed in the name of science. For instance, while it might speed up genetics research if countries had genetic databases of their citizens that researchers could use, this would go against values of privacy and bodily autonomy that are very important to most people. Public input and engagement are critical to understanding which values a society considers most important and how people want research to reflect them.

[Image: Artist depiction of the Thirty Meter Telescope. Courtesy TMT International Observatory.]

Limits of Scientific Knowledge

Acknowledgments and Further Reading

  • The Honest Broker, by Roger Pielke Jr., presents four archetypes of scientific input into policy and politics and discusses the importance of values and uncertainty in science policy.
  • The discussion here of the Thirty Meter Telescope is influenced by the work of Chanda Prescod-Weinstein and her Decolonising Science Reading List, from which sources were selected.
  • Related to, but distinct from, “usable science” is the concept of “policy readiness of ideas”. This is a policy analogue to “technology readiness levels”.