Six Approaches to Making Ethical Decisions in Cases of Uncertainty and Risk (2024)


The Principles of Prevention, Precaution, Prudent Vigilance, Polluter Pays, Gambler’s, and Proaction

One of the most difficult times to make ethical decisions is when there is great uncertainty about what the best decision is, or how to go about achieving that best end. Here I will present six contemporary principles, or risk standards, which are approaches for dealing with uncertainty and risk (for a discussion of one historical approach, see probabilism). I will explain each principle and give examples, then discuss some themes.

A key point of connection between risk standards and ethics is that in riskier situations it often makes sense to use more stringent risk standards, and in the riskiest situations, the most stringent risk standards are more likely to be ethically justifiable. These risk standards might be helpfully connected to the Markkula Center’s Framework for Ethical Decision Making when making ethical decisions in uncertain situations.

It is also worth noting that risk tolerance can vary significantly between individuals and between cultures, so disagreements are likely to appear when discussing the ethics of risks. That does not make ethical decision making impossible; it just means that it might be more difficult, and that communication is very important so that all involved groups know and understand what is going on, how, and why.

1) The Prevention Principle takes a highly cautious approach towards ethical decision making because it specifically relates to situations with certainty of negative outcomes. It follows the general rule that “prevention is better than cure,” and therefore harms ought to be anticipated and pre-empted, rather than experienced and solved later (as in the “Polluter Pays Principle”).

This principle is generally uncontroversial in cases where cause and effect are clear and certain; it is when it moves towards uncertainty that more controversy appears, and the Precautionary Principle tends to be invoked instead. [1]

Examples: the Prevention Principle would promote placing safety requirements on automobiles (such as seat belts and airbags), since the certainty of accidents across a population is 100%, and it is better to prevent or reduce injuries rather than cope with them afterwards. Similarly, polluting industries might face requirements to reduce or prevent certain types of pollution, as in using flue-gas desulfurization (sulfur dioxide scrubbers) on coal-fired power plants to prevent acid rain.

2) The Precautionary Principle is an approach to risk management and ethical decision making which seeks to prevent possible harms in cases where there is not yet scientific consensus on connections between cause and effect. The approach requires only that there be a plausible scientific connection, not a certain one. This approach is more likely to avoid damages, since waiting for the damage to occur (and thus establish a connection) is too late.

This is a more stringent risk standard than the prevention principle due to its acceptance of causal uncertainty. Over time, if causation becomes clearer (thereby decreasing uncertainty), this approach could be shifted towards prevention (if the connection is established), dropped (if the connection is not established), or another approach chosen (if the situation remains complicated). [2]

Examples: the Precautionary Principle is standard for the pharmaceutical approval process in most nations, where new medicines are approved slowly, under careful conditions, so as to avoid widespread social harms. Another example includes the responses of some nations towards genetically modified organisms (GMOs), where safety suspicions delayed deployment until more certainty was established.

3) Prudent Vigilance is an approach to risk which seeks to proceed with the potentially risky behavior while remaining vigilant of risks that might be developing or becoming more certain as one proceeds. It seeks to establish processes for assessing likely benefits and risks before, during, and after an undertaking, and continues “to evaluate safety and security as technologies develop and diffuse into public and private sectors.” [3] Prudent vigilance allows for risk-taking behavior, but with the understanding that ongoing evaluation is necessary. [3, 4]

Examples: Prudent Vigilance was a cornerstone for the United States’ Obama-era Presidential Commission for the Study of Bioethical Issues, in their 2010 report on the ethics of synthetic biology and other emerging technologies. It has remained a principle for discussion and consideration in this field, and has expanded to a few others, including environmental protection and international relations. [5, 6]

4) The Polluter Pays Principle is a risk standard which permits risk-taking behavior and then, if something goes wrong, assigns clean-up for the harms to those who created the harms. [1] This risk standard is responsive rather than anticipatory, and assumes that risk takers will either self-police (and not make errors), or, if self-policing fails, will be capable of making up for the harms they have produced. Ethically, Polluter Pays values freedom and responsibility, and assumes that, for the most part, people lack the power to significantly affect the future, and that those who can affect the future are meticulously careful, honorable, and benevolent.

Because of growing technological power, this principle is now obsolete in many cases, as damages sometimes can be planetary in scale, long term, and irreversible. In cases where it is difficult to hold entities responsible for their actions, or where damage is too much for them to redress, a more anticipatory strategy makes more sense. Additionally, the complexity of society can make it more likely that unscrupulous entities will not be held accountable.

Examples: the Polluter Pays Principle is at work in any situation where it is assumed that harms can be tolerated, and the agents of that harm held accountable for their actions, typically through legal or legislative recourse. Environmental dumping, even on a small scale, such as littering, sometimes shows this principle in action, as the polluter is typically fined for their misdeed.

5) The Gambler’s Principle counsels risk takers to avoid risking damages which, if they occurred, would be ethically unacceptable, ranging up to the largest technological disasters, including global catastrophic and existential risks. Philosophers of technology Hans Jonas and Michael Davis have each advocated this approach, Jonas describing it as forbidding “any ‘va banque’ [“go for broke” or “all in”] game in the affairs of humanity,” [7] and Davis as “don’t bet more than you can afford to lose.” [8]

Davis describes this principle in more detail: “If we (society at its rational best) would reject any plausible benefit in exchange for suffering that harm, we (that part of society making the decision) should, all else equal, rule out any design that risks that harm (however small the probability — so long as it is finite).” [8] Put another way, if a risk can be voluntarily assumed or declined, then for any unacceptable harm, if the probability is non-zero, then the risk is too high, and is therefore unethical and should not be taken. [9, 10]
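As a rough sketch (the function name and parameters here are my own illustration, not from Jonas or Davis), the Gambler's decision rule can be expressed in code: any non-zero probability of an unacceptable harm vetoes the option, regardless of the promised benefit.

```python
def permitted(prob_of_harm: float, harm_is_unacceptable: bool) -> bool:
    """Gambler's Principle as a decision rule: rule out any option
    carrying a non-zero probability of an unacceptable harm,
    no matter how large the promised benefit."""
    if harm_is_unacceptable and prob_of_harm > 0:
        return False  # "don't bet more than you can afford to lose"
    return True  # mundane harms are left to other risk standards

# Even a vanishingly small chance of an unacceptable harm is ruled out:
assert permitted(1e-9, harm_is_unacceptable=True) is False
# Ordinary, acceptable-scale risks pass through this principle:
assert permitted(0.3, harm_is_unacceptable=False) is True
```

Note that the probability term does no work once a harm is classed as unacceptable; that threshold judgment, not the arithmetic, carries the ethical weight.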

This risk standard is focused only on the very largest and worst harms, while ignoring more mundane harms. It is anticipatory in nature towards these larger harms, and responsive in nature towards smaller harms. In this way, it can be viewed as more like the Prevention or Precautionary Principles with respect to larger harms and the Polluter Pays Principle with respect to smaller harms.

Examples: the Gambler’s Principle would counsel rejecting the construction of a nuclear power plant, if a meltdown and subsequent radioactive pollution were deemed an unacceptable risk. Another example might be the development of self-replicating nanotechnology, which could bring great benefits, but risks consuming the world if weaponized or allowed to run out of control. In other cases, such as car accidents or more “average” harms, this principle permits the risky behavior and a reactive response if necessary, or it defers to another risk standard.

6) The Proactionary Principle is an approach to risk taking behavior which argues that innovation and technological progress should be pursued with speed. [11] It characterizes the current risk conditions as unacceptably bad (i.e. unethical), and therefore argues that other risks ought to be taken in order to escape the current risky state. It is an approach to risk which emphasizes action now, even in the face of possible negative effects, because if actions are not taken now, then the current unacceptable state will continue, and the future itself may be at stake.

It is optimistic in assuming that the future will be better, despite the risks taken to get there (and any possible ongoing harms from those risks), and is pessimistic about the current state of the world. The Proactionary Principle places faith in the benefits of technological progress. It does not cope well with the most disastrous and irreversible risks of technology, such as existential risks.

Examples: the Proactionary Principle is visible anytime a risk is deemed to be worth the reward, e.g. when taking a new job, buying a house, starting a business, etc. With respect to technological development, it could be used to promote certain technologies such as radical life extension, space settlement, peace-building technologies, and environmental sustainability technologies, arguing that those technologies ought to be developed as quickly as possible, because our current situation is quite dire. Historically, the Manhattan Project followed the Proactionary Principle due to fear of Nazi Germany obtaining the atomic bomb first, and in this effort was pushed forward even as significant scientists worried that it risked igniting the Earth’s atmosphere and destroying all life. [12, 13]

Discussion

There are several ethical dimensions at play in these principles. A first is whether they are anticipatory of harms or reactive/responsive to harms. In the past, permitting harms, then reacting to them, was considered to be acceptable in many cases, since harms were often less damaging.

As a second related dimension, there is the question of whether entities can be trusted to make amends for their damages after the fact, or whether they are likely to shirk their responsibilities and go unpunished, thus contributing to social degradation and breakdown of trust. The more likely it is for damages to go unpunished and/or unredressed, the more important it is to prevent them. Given the complex interactions of entities across the globe and over time, and the rise of uncertain causal connections, lack of accountability has increased and is likely to continue to do so.

Relatedly, a third dimension is the magnitude of the harms at stake. As technology has expanded the human capacity for disaster, more need for anticipation and pre-emption has emerged. Irreversible harms such as species extinctions, and harms of massive scale both spatially and temporally, such as climate change, have necessitated new ways of looking at the ethics of risk.

A fourth dimension is the probability or uncertainty of the risk. As technology has expanded human power, it has also increased our scope of action in unpredictable ways, and therefore uncertainty about the effects of our choices has increased. Every new technology deployed is something like a socio-environmental experiment, exploring the world for effects, both anticipated and unanticipated. In this environment of enhanced uncertainty, risk is much harder to calculate, uncertainty much higher, and therefore risk ought to be avoided more carefully.

Combining some of these dimensions is possible through the “Risk Equation,” often written as Risk = Probability x Harm, or R = p(L), where “R” is risk, “p” is probability, and “L” is loss or harm. The Risk Equation informs several of the above principles and can be a useful interpretative framework for conceptualizing how some aspects of these principles relate to each other.
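To make the equation concrete, here is a minimal sketch (function name and figures are illustrative, not from the source) showing how the same expected loss can arise from very different probability/harm combinations, which is precisely where the principles diverge:

```python
def risk(probability: float, loss: float) -> float:
    """The Risk Equation: R = p(L), i.e. expected loss."""
    return probability * loss

# A rare, severe harm and a common, mild harm can carry
# the same expected loss:
rare_severe = risk(0.001, 1000.0)
common_mild = risk(0.5, 2.0)
assert abs(rare_severe - common_mild) < 1e-12
```

The Gambler's Principle, by contrast, would treat these two cases very differently if the severe harm were deemed unacceptable, since it ignores the probability term once a harm crosses that threshold.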

Lastly, these principles are not presented with the intent of advocating any particular one. Each has its uses, depending on the circumstances. However, it is worth noting that as human impact on the world has increased in past decades (due to technological harms increasing as well as overall uncertainty), societal risk tolerances have understandably reacted, and it may seem that there has been an overall shift towards more risk-averse approaches.

However, perceived in another way, it is merely that the world has changed, while societal risk tolerances have remained even, and these social preferences have gradually expressed a reaction to the shift in power in the techno-social environment. In other words, it is risk that has increased, not risk aversion. In a world where there are more dangerous choices, there is more to say “no” to, [14] and a greater role for ethics as well.

References

[1] World Commission on the Ethics of Scientific Knowledge and Technology (COMEST), “The Precautionary Principle” (Paris: United Nations Educational, Scientific and Cultural Organization (UNESCO), 2005) 7-8. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000139578

[2] “Precautionary Principle,” Glossary of Summaries, EUR-Lex: Access to European Union Law, website, accessed July 6, 2016. Available at: http://eur-lex.europa.eu/summary/glossary/precautionary_principle.html

[3] Presidential Commission for the Study of Bioethical Issues, “New Directions: Ethics of Synthetic Biology and Emerging Technologies,” Washington, D.C, December 2010, p. 27, 123. Available at: http://bioethics.gov/sites/default/files/PCSBI-Synthetic-Biology-Report-12.16.10_0.pdf

[4] Amy Gutmann, “The Ethics of Synthetic Biology: Guiding Principles for Emerging Technologies,” The Hastings Center Report (July-August 2011): 17-22. Available at: https://onlinelibrary.wiley.com/doi/pdf/10.1002/j.1552-146X.2011.tb00118.x

[5] Alison McLennan, “Chapter 5: Environmental risk: uncertainty, precaution, prudent vigilance and adaptation,” in Regulation of Synthetic Biology: BioBricks, Biopunks and Bioentrepreneurs, Elgar Studies in Law and Regulation, by Alison McLennan (Cheltenham, UK: Edward Elgar Publishing, 2018). Precis available at Elgar Online: https://www.elgaronline.com/abstract/9781785369438/14_chapter5.xhtml?

[6] Keir Giles, “Russia Hit Multiple Targets with Zapad-2017,” U.S.-Russia Insight, Carnegie Endowment for International Peace, January 2018. Available at: https://carnegieendowment.org/files/Giles_Zapad_web.pdf

[7] Hans Jonas, The Imperative of Responsibility, (Chicago: University of Chicago Press, 1984) 38.

[8] Michael Davis, “Three nuclear disasters and a hurricane,” Journal of Applied Ethics and Philosophy 4 (August 2012) 8. Available at: https://eprints.lib.hokudai.ac.jp/dspace/bitstream/2115/50468/1/jaep4-1_micael%20davis.pdf

[9] Brian Patrick Green, “Transhumanism and Roman Catholicism: Imagined and Real Tensions,” Theology and Science 13:2 (2015): 196.

[10] Brian Patrick Green, “Little Prevention, Less Cure: Synthetic Biology, Existential Risk, and Ethics,” Workshop on Research Agendas in the Societal Aspects of Synthetic Biology, Tempe, Arizona, November 4-6, 2014. Available at: https://cns.asu.edu/sites/default/files/greenp_synbiopaper_2014.pdf

[11] Max More, “The Proactionary Principle, Version 1.0,” Extropy.org, 2004. Available at: http://www.extropy.org/proactionaryprinciple.htm

[12] Emil Konopinski, Cloyd Marvin, and Edward Teller, “Ignition of the Atmosphere with Nuclear Bombs,” Classified US Government Report (declassified 1979), August 14, 1946. Available at: https://fas.org/sgp/othergov/doe/lanl/docs1/00329010.pdf

[13] Daniel Ellsberg, “Risking Doomsday I: Atmospheric Ignition,” in Daniel Ellsberg The Doomsday Machine: Confessions of a Nuclear War Planner, (New York: Bloomsbury, 2017) pp. 274-85.

[14] Brian Patrick Green, “The Catholic Church and Technological Progress: Past, Present, and Future.” Religions, special issue guest edited by Noreen Herzfeld, 1 June 2017, 8(106): 12. Available at: http://www.mdpi.com/2077-1444/8/6/106/htm
