Evaluating cognitive bias in clinical ethics supports: a scoping review
BMC Medical Ethics volume 26, Article number: 16 (2025)
Abstract
Background
A variety of cognitive biases are known to compromise ethical deliberation and decision-making processes. However, little is known about their role in clinical ethics supports (CES).
Methods
We searched five electronic databases (PubMed, PsycINFO, Web of Science, CINAHL, and Medline) to identify articles describing cognitive bias in the context of committees that deliberate on ethical issues concerning patients, at all levels of care. We charted the data from the retrieved articles, including authors and year of publication, title, CES reference, the reported cognitive bias, paper type, and approach.
Results
Of an initial 572 records retrieved, we screened the titles and abstracts of 128 articles, and identified 58 articles for full review. Four articles were selected for inclusion. Two are empirical investigations of bias in two CES, and two are theoretical, conceptual papers that discuss cognitive bias during CES deliberations. Our main findings first provide an overview of biases related to the human working environment and to information gathering across different types of CES. Second, several determinants of cognitive bias were highlighted; in particular, stressful environments may favor cognitive bias, whatever the clinical dilemma.
Conclusions
While the need for a better taxonomy of cognitive bias in CES is highlighted, a proposal is made to focus on the individual, group, institutional, and professional biases that can be present during clinical ethics deliberation. However, future studies need to focus on an ecological evaluation of CES deliberations, in order to better characterize cognitive biases and to study how they impact the quality of ethical decision-making. This information would be useful in considering countermeasures to ensure that deliberation is as unbiased as possible, allowing the most appropriate ethical decision to emerge in response to the dilemma at hand.
Introduction
Various ethical dilemmas can arise not only in the context of day-to-day healthcare, but also in acute or crisis situations such as the COVID-19 pandemic. Issues relate to a wide variety of areas, ranging from prenatal care to end-of-life care [1]. In general, ethical dilemmas are characterized by a choice between two, or a few, mutually-exclusive options, none of which is satisfactory, because there are undesirable moral consequences [2]. Typically, nurses, physicians, or other members of the care team do not know what to do for the best, and need counseling regarding the optimal decision [3]. One study highlights that the most-often-reported ethical difficulties are linked to uncertainty regarding impaired decision-making capacity of a patient, disagreement among caregivers, and limiting end-of-life treatment [4]. For nurses, it has been proposed that an ethical dilemma is characterized by the interaction between three central concepts: the nurse (‘self’), the patient (‘other’), and health (the ‘good’). In this situation, the dynamic balance of the nurse-patient interaction becomes blurred, and a choice must be made between equally-valid ethical outcomes or ideals [5]. Overall, most dilemmas relate to balancing care quality and efficiency, allocating limited medications or support tools, end-of-life, access to care, caregiver and patient confidentiality, and allocating limited donor organs. More precisely, Moeller et al. [6] found that most ethical dilemmas fall into one of the following general categories: conflict about withholding or withdrawing treatment; futility issues; the decisional capacity of the patient; unknown wishes of the patient; non-compliant patients; and Do Not Resuscitate (DNR) orders [6].
In each of these situations, healthcare personnel can be uncomfortable or uncertain regarding what is right or best for the patient, which can lead to disagreements about what should be done [7]. This experience is known not only to be a risk for the health of medical staff, but also to lead to feelings of tension and frustration in clinical practice, to be a source of professional dissatisfaction, and to compromise social relationships, including interactions with patients [8, 9]. Together, these feelings create moral distress, which often leads, over time, to staff resigning from the unit or organization, and/or leaving the profession. First coined by Andrew Jameton [9], moral distress refers to the stress caused by intending to pursue a morally preferred action but being unable to do so because of institutional barriers. It has been described in all healthcare professionals, including physicians [10], nurses [7,8,9], pharmacists [11], and social workers [12].
Addressing difficult ethical issues in an appropriate way is a key challenge in everyday patient care. As a result, recent decades have seen the increasing development of ethical awareness in the health sector, which has resulted in the creation of clinical ethics supports (CES). The latter provide a forum for structured deliberation, and assist in reflections on ethical dilemmas in inpatient healthcare settings. There are several types, which can be divided into two main groups, although this categorization may vary from country to country [1]. The first includes clinical ethics consultations and clinical ethics committees. These bodies traditionally provide healthcare personnel in clinical practice with advice and recommendations regarding the best course of action [13]. Participants are trained in the systematic analysis and structured discussion of ethical dilemmas, together with all stakeholders [14]. The aim is to promote openness about value judgments, and be able to justify decisions taken in day-to-day clinical practice.
The second group consists of moral case deliberation, ethics rounds, ethics discussion groups, and ethics reflection groups. Here, the goal is to encourage healthcare personnel to broaden their perspectives through reflection, thereby increasing insight into ethical issues. Moral case deliberation, in particular, is a specific kind of ethics support. It starts with the concrete experience of participants, and presupposes that good care cannot be determined in advance, based on theoretical principles or theories. Rather, good care emerges from a dialog (deliberation), in which participants examine and share their views, based on earlier experiences [15]. Such a form of reflection has been found to have psychological benefits for personnel working in healthcare settings, as it creates an atmosphere in which they feel free to express feelings and emotions related to cases they are struggling with [16]. Irrespective of categorization, clinical ethics consultations and moral case deliberation constitute two modalities of ethical decision support that can overlap, notably through the use of ethical deliberation methodology.
In the domain of clinical ethics, there are three main, complementary theories: (i) consequentialism focuses on the ethical consequences of an action; (ii) deontology considers that ethical action means doing one's duty; and (iii) virtue ethics considers that ethics are a matter of cultivating appropriate virtues [17, 18]. Finally, the principlism approach, which is notably based on the primacy of beneficence and care-based ethics, offers a synthesis of the three fundamental theories, with the aim of being clear and easy to apply when clinical decisions need to be made in practice [19]. All of these approaches offer a framework for deliberation.
Ethical deliberation aims to put in place conditions that favor contradictory debate, implying a critical dialogue: a process of collaborative communication between people that fosters a shared understanding of an ethical problem. Such approaches assume that parties have different, often opposing, positions, but also that they are likely to share some common ground, or at least complementary views from which solutions may emerge. The conditions that favor contradictory debate include: (i) holding dedicated meetings; (ii) involving experts in the field and external third parties; and (iii) adhering to a form of moral contractualism (Billier, 2014). It is a formal framework that aims to ensure that rational decisions are taken in the face of the emotion that can be associated with a clinical dilemma [16,17,18,19,20]. The aim is to allow both a consensus and a solution (such as advice) to emerge [21]. The solution presupposes a sharp distinction between rational analysis (based on knowledge of ethical concepts, principles, and theories) and communicative processes (such as the emotional response to actual clinical cases), which limit rationality.
Various cognitive and affective biases are known to compromise, in some cases, both deliberation and decision-making processes. In general, a bias is usually the result of prejudice when choosing one thing over another. Biases can be influenced by experience, judgment, social norms, assumptions, academic training, and more. Affective biases typically occur spontaneously, based on the personal feelings of an individual at the time a decision is made; they are usually not based on expansive conceptual reasoning. Cognitive biases generally involve decision-making based on established concepts that may or may not be accurate. Both cognitive and affective biases may or may not prove beneficial when they influence a decision. While the rationality that is the foundation of CES has been considered in relation to the emotional state that arises when the question is important to the participant, less is known about the impact of cognitive bias. Cognitive bias refers to systematic cognitive distortion that is inherent to human cognition [22]. These systematic cognitive processes are misleading: false logic distorts information processing. Also known as "mental contamination", these unconscious and uncontrollable mental processes can drive unwanted responses [23]. They can impair judgement, especially when there is a large amount of information, or when time is limited.
According to dual process theory, two competing processes participate in human cognition: Type 1 (T1) processes are fast, automatic, and affect-driven; Type 2 (T2) processes are slow, deliberative, and underlie higher-order thinking [24,25,26]. T2 processes are characterized by their heavy load on working memory, explicitness, high effort, and slowness. T1 processes, on the other hand, are characterized by their low load on working memory, implicitness, low effort, and speed. They are fast and efficient, enabling us to respond almost instantly to many situations, especially those that threaten survival [26]. They are reasonably accurate and effective, saving mental energy for times when deep thought is required. However, they rely on generalities and are error-prone, which is considered to favor the emergence of cognitive biases [26]. To date, over 100 cognitive biases have been described in the general literature, and at least 38 in the medical literature, with a focus on diagnostic reasoning and therapeutic choices [27,28,29]. Although analytic thinking is considered a mindful counterpoint to intuitive thinking, errors due to cognitive bias appear to be explained by both T1 and T2 systems [29]. Interestingly, the literature suggests that increasing expertise (and knowledge) may decrease the likelihood of errors [19,20,21,22,23,24,25,26,27,28,29,30].
Since Tversky and Kahneman's seminal 1974 paper [22], the notion of cognitive bias has been explored in various fields, including both clinical reasoning and medical practice [27,28,29, 31]. Clinical ethical practices have received less attention, despite increasing interest in cognitive bias in ethical reasoning. Ethical reasoning is difficult and complex. It requires soft skills (such as communication, the ability to deliberate, and self-reflection), specific knowledge (about ethics and bioethics, moral philosophy, clinical matters, and medical legislation), and experience of how similar cases have been resolved. Cognitive biases have been little studied in the context of CES, and we do not know whether they affect CES members, who are experts in the field of clinical reasoning.
A recent review, focused on bioethics, has highlighted several potentially relevant biases, notably cognitive biases, affective biases, imperatives, and moral biases [32]. Its author stresses the importance of identifying and addressing biases, as a way to help in assessing and improving the quality of bioethics work. Three types of bias are described according to their relation to cognitive processes. These biases disrupt the perception of reality to make it fit the subject's frames, or conform to his or her preconceived ideas. They are: confirmation bias; action bias (e.g., overestimation, planner's bias, excessive optimism); and status quo bias. The latter is the most striking: it refers to a preference for maintaining the current or previous state of affairs, leading to a failure to take action that would change this state.
Furthermore, we can ask how human rationality impacts deliberation in the context of clinical ethics. To the best of our knowledge, the impact of cognitive bias on ethical decision-making (EDM) has not yet been studied. Theoretical EDM models often conflict with each other, and can be divided into: (a) rationalist-based (i.e., reason; [33]); and (b) non-rationalist-based (i.e., intuition and emotion; [34]). Consistent with this line of thinking, the dual process theory of human cognition [26,27,28,29,30,31,32,33,34,35] considers that moral intuition is an automatic response, and an antecedent to rational reasoning [36, 37]. Emotive intuition is quick and effortless, while cognitive reasoning is slow and requires effort [36]. Although there are many EDM models [38], which have been applied in various clinical fields [38, 39], to the best of our knowledge, existing approaches do not take into account shared decision-making. Shared decision-making refers to the interpersonal, interdependent process in which physicians, patients, and their caregivers relate to, and influence each other, as they collaborate to make decisions about a patient’s healthcare [40]. Although shared decision-making can occur in different ethical decision-making models, like EDM, little is known about the impact of cognitive bias during this process.
Thus, the aim of this scoping review is to study the literature on cognitive bias in clinical ethics deliberation. The objectives are to evaluate: (i) the biases that have been highlighted in the context of clinical ethics; and (ii) the contexts in which they are most likely to appear. These objectives can help to better categorize biases in ethical processes related to clinical decision-making, and to identify debiasing actions.
Method
To guide the selection process and conduct the scoping review, we used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) [41] and scoping review methodology guidance [42,43,44].
Search strategy
Key databases included PubMed, PsycINFO, Web of Science, CINAHL (the Cumulative Index to Nursing and Allied Health Literature), and Medline. The first step was to identify indexed search terms, based on CINAHL headings, MeSH terms, and thesaurus entries. A list of possible terms was created. The primary search was supplemented by a manual search of relevant journals, a search of the abstracts of citations listed in accepted articles, and by contacting lead authors in the field (to identify articles that may not have been indexed in the above databases). This additional search was deemed necessary because the literature is often widely dispersed, and relevant studies might not be indexed in all databases [45]. An experienced university librarian, and another from the University Hospital of Clermont-Ferrand (France), independently assisted the first author. Manual searches of the above-mentioned databases did not result in the identification of additional articles. The primary search strategy was inclusive, to avoid excluding potentially relevant articles. The following search strings were used in combination, as sketched below.
- String 1 was used to capture articles describing the many different types of clinical ethics committees: ethical professionals, clinical ethics support, clinical ethics committees, ethics case reflection, ethics rounds and ethics reflection groups, healthcare settings, ethical consultation, clinical ethics consultation, ethics discussion groups, ethics reflection groups, clinical ethics consultation, moral case deliberation, ethics rounds, healthcare workers, clinical ethical challenge.
- String 2 was used to capture phenomena defined as cognitive biases, and the errors that are likely to occur in this field: cognitive bias, explicit bias, implicit bias, cognitive distortion, cognitive errors.
- String 3 was used to capture the context of ethical deliberation: moral competence, moral teamwork and moral action, moral case deliberations, moral case deliberation outcome, ethical decisions, ethical decision making, ethical decision-making process, ethical behavior, ethical reasoning, ethical dilemmas, ethical decision models, ethical issue.
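To make the combination explicit, the following minimal sketch (illustrative only; the term lists are abridged, and the exact field tags and syntax differ for each database) shows how the three strings combine into a single boolean query of the form (String 1) AND (String 2) AND (String 3).

```python
# Illustrative sketch: combining the three search strings into one boolean
# query. Term lists are abridged; real database syntax (PubMed, PsycINFO,
# CINAHL, ...) uses field tags and controlled vocabulary not shown here.

string_1 = [  # types of clinical ethics support
    "clinical ethics support", "clinical ethics committees",
    "clinical ethics consultation", "moral case deliberation",
    "ethics rounds", "ethics reflection groups",
]
string_2 = [  # cognitive bias phenomena
    "cognitive bias", "explicit bias", "implicit bias",
    "cognitive distortion", "cognitive errors",
]
string_3 = [  # ethical deliberation context
    "ethical decision making", "ethical reasoning",
    "ethical dilemmas", "moral case deliberation outcome",
]

def or_block(terms):
    """Quote each phrase and join the alternatives with OR."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Strings 1-3 are combined conjunctively: a record must match at least one
# term from each string to be retrieved.
query = " AND ".join(or_block(s) for s in (string_1, string_2, string_3))
print(query)
```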
Our aim was to explore all of the various types of clinical ethics committees that deliberate on ethical issues raised by patients, at all levels of care. The acronym CES was used when it was not useful to specify the type of clinical ethics support. We therefore incorporated a broad range of terms in strings one and three, in an attempt to capture the diversity of activities and approaches that might be used in the context of clinical ethics.
Inclusion/exclusion criteria
As specified in the systematic review method [45,46,47], several criteria were used to select studies suitable for inclusion: (a) published in full, in the English language; (b) based on empirical data published in a peer-reviewed journal (i.e., excluding magazine articles and book chapters); (c) published between January 2000 and December 2023; (d) describing a CES [1]; (e) relating specifically to some form of cognitive bias; and (f) substantive content relating specifically to a cognitive bias in the context of a CES. The aim was to ensure that only the most relevant, up-to-date, and reliable sources were included.
The time period (January 2000 to December 2023) was chosen in order to obtain a recent, representative view of advances since the development of CES in different countries. Studies were included based on the nature of the ethical deliberation (clinical and not research committees), and the need to evaluate the impact of the cognitive bias environment on clinical behaviors or issues.
A study was excluded if: (1) biases were not studied within the specific framework of a CES (e.g., bioethics committees, decision-making processes in clinical interactions, psychotherapeutic interactions); (2) biases were not studied as cognitive biases (e.g., cognitive distortion, racial/ethnic bias); (3) the CES was not considered as a forum for structured deliberation that assists reflection on ethical dilemmas in inpatient healthcare settings (e.g., training activities); (4) it was not written in English; (5) the full text was not available; or (6) it examined cognitive bias in CES before January 2000 or after December 2023.
Study screening and selection
The combined search yielded an initial corpus of 572 articles. Of these, 128 papers met at least one exclusion criterion, and 58 were deemed relevant for further review. We imported the 58 retrieved entries into an Excel table. MT identified and excluded duplicates, then conducted an initial screening by title, abstract, and full content, according to the research question and the MIP (Methodology, Issue, Participants) process. This was followed by a second screening by LG, AL, and FGF, in which only articles from, reporting on, or referencing CES and cognitive bias were included. Disagreements were resolved through discussion, with the experienced university librarian (NP) providing adjudication where necessary. Finally, four articles were considered suitable for inclusion in the review (Fig. 1).
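As a minimal illustration of this bookkeeping (the stage labels, reasons, and records below are hypothetical, not those of the study), each retrieved record can be tagged with the screening stage at which it was excluded, from which PRISMA-style counts follow directly.

```python
# Hypothetical sketch of screening bookkeeping: one record per retrieved
# article, tagged with the stage at which it was excluded (or None if kept).
# Stage and reason labels are illustrative, not taken from the study.
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    title: str
    excluded_at: Optional[str]  # "duplicate", "title/abstract", "full text", or None
    reason: Optional[str] = None

def prisma_counts(records):
    """Tally records per screening outcome for a PRISMA-style flow chart."""
    return Counter(r.excluded_at or "included" for r in records)

corpus = [
    Record("A", "duplicate"),
    Record("B", "title/abstract", "no CES described"),
    Record("C", "full text", "bias not cognitive"),
    Record("D", None),  # meets all inclusion criteria
]
print(prisma_counts(corpus))
# Counter({'duplicate': 1, 'title/abstract': 1, 'full text': 1, 'included': 1})
```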
Data extraction, analysis, and synthesis
The included studies were read several times, in full, by the first author, in order to become familiar with the data. In the next step, we extracted and summarized their findings [45,46,47].
After several readings of the included studies, we decided that an approach inspired by manifest content analysis [48, 49] would be most appropriate for extracting relevant themes. The corpus was composed of two empirical (qualitative and quantitative) articles, and two theoretical papers. A content analysis was selected because it could be used to parse the four studies' findings according to cognitive biases in the CES context. In contrast, a directed content analysis uses existing concepts or theoretical frameworks (e.g., types of cognitive bias) to focus the research question, and forms the foundation for the categorization of the biases. The characteristics of the studies included in this review are presented in Table 1. Due to the heterogeneous nature of the methods used, and the way findings were reported, we decided to carry out a narrative interpretive/integrative synthesis of the outcomes of the four studies [50].
Results
Drawing upon the corpus of articles, we noted the different contexts in terms of CES, and in terms of patients and ethical dilemmas. Two contexts focused on incapacitated patients. Most findings related to bias fell into the domain of individual biases, biases related to the professional environment, or biases implicated in the functioning of the CES. Accordingly, our findings are grouped under the following three headings: methods applied to different CES, types of patients and clinical dilemmas, and types of cognitive bias.
Methods applied to different CES
The four included studies differ in terms of method. On the one hand, the theoretical approach taken by Magelssen et al. [52] and Schleger et al. [53] is based on an analysis of experience and interdisciplinary competencies. The latter include CES, psychology, nursing, health quality, and cognitive biases. In particular, Magelssen et al. [52] mainly draw upon examples taken from clinical ethics consultants, and the experience of committees in Norway. On the other hand, a mixed methods approach is applied in the two experimental studies, namely Blackstone et al. [51] and Stanak and Hawlik [54]. Blackstone et al. [51] draw their conclusions from three examples, and 12 interviews concerning 'incapacitated patients without proxies' (PWPs). Stanak and Hawlik [54] investigate neonatal intensive care, and target extremely pre-term situations. Their systematic literature review examines the shared decision-making context, and the researchers carried out in-depth, semi-structured interviews with five heads of neonatology departments, and one clinical ethicist.
Both theoretical studies take into account CES such as informal and routine rounds (when an ethical problem arises) and full ethics consultations. No information is given regarding who participated, either in terms of ethics training or experience. Stanak and Hawlik [54] reviewed 80 publications relevant to the theme of bias in neonatal decision-making that discuss the role of parents, physicians, an ethical council, or an ethical committee. The two experimental studies examined local CES dedicated to specific ethical dilemmas, which had been set up to decide how best to respond.
Blackstone et al. [51] describe a protocol that has been in place in their institution since 2005, which is implemented when medical teams identify an incapacitated PWP who requires imminent, but non-emergent medical decisions. First, the ethics consultant works with the medical team and a social worker to conduct an aggressive search for a surrogate, or someone who knows the patient, who can provide information regarding what treatment the patient might want. If no-one is found, the consultant contacts the PWP’s committee members to recruit individuals who will coordinate with the medical team. The PWP committee is composed of community volunteers. These people are intentionally not medical practitioners, and, in order to minimize conflict of interest, they cannot be hospital employees. Although no educational requirements are specified, most volunteers have a background in bioethics, in addition to their experience on the ethics committee. In the studied population, all but one held a postgraduate degree, and all had clinical experience, from varied perspectives. Furthermore, at the time the interviews were run, their experience with PWP cases ranged from two to 18 cases, over one to 10 years of participation in the committee.
During the meeting, the medical team presents the clinical facts, the available treatment options, and their recommendations. The ethics consultant and the social worker provide any information they have gathered about the patient’s background. If available, friends share information regarding the patient’s personality, values, and lifestyle, and give input about what they think the patient might have wanted in this situation. PWP committee members synthesize this information, discuss it, and make recommendations that are thought to be in the patient’s best interest, and consistent with his or her values.
Stanak and Hawlik [54] investigate professional stakeholders working in neonatal units. They interviewed five heads of neonatal departments, and a clinical ethicist working in a Neonatal Intensive Care Unit. They analyzed communication strategies with parents, and the possible impacts on survival and neurodevelopmental outcomes. All participants had previous experience of neonatal decision-making situations, but no details were given regarding their training in medical ethics. Parents were not interviewed.
Overall, we found that different methodologies are used, the studied CES are diverse, their members may (or may not) be exclusively caregivers, and participants’ training and experience varies greatly from one study to another.
Types of patients and clinical dilemmas
The two theoretical studies do not target a specific clinical dilemma; rather they consider clinical dilemmas and cognitive biases in general. However, the two experimental studies focus on a situation in which the clinical dilemma is due to a lack of information about the patient’s preferences when a medical decision should be made.
The study by Blackstone et al. [51] concerns incapacitated PWPs in acute, but non-emergent situations. A medical choice must be made regarding incapacitated patients who lack a proxy. Such clinical dilemmas are challenging in clinical units, where patients require complex care, and decisions must be taken that set the course for treatment.
One of the difficulties in this context relates to the legal framework, which specifies who has the authority to make medical decisions on behalf of PWPs, and what types of decisions they can make. While consent can be assumed in an emergency, many non-emergent situations require decisions to be taken more quickly than the weeks (to months) it would take to appoint a guardian. Committee members must be able to cope with the doubt and ambivalence that is inherent in the decisions they take. Current legislation does not address the case of PWPs, and this situation has led individual institutions to develop their own procedures when medical decisions must be taken for acutely ill patients. While it is reasonable to assume that medical teams make decisions that are in the PWP's best interest, the committee usually has an incomplete picture of who the patient is, and what he or she would have wanted. This can lead to a certain degree of projection of committee members' values onto patients.
Not only are protocols poorly disseminated, but also their outcomes are understudied. The problem is clearly illustrated in the United States, where state legislation varies widely regarding who has the authority to make medical decisions for incapacitated PWPs, and what types of decisions they can make.
Stanak and Hawlik [54] explore the shared decision-making context for extremely pre-term infants (22–25 weeks of gestation). The precise determination of the number of weeks of gestation is important, as the probability of survival, and of survival without neurodevelopmental impairment, increases significantly between weeks 22 and 25. This determination creates an ethical dilemma: avoiding the creation of an unnecessary burden on the infant and the family, on the one hand, and giving the infant a realistic chance of survival, on the other. How neonatologists communicate the available options or choices to parents can be an ethical challenge, as this discussion is known to have an impact on what parents decide, and hence on the survival or neurodevelopment of the fetus. However, current guidelines do not address communication strategies with parents either pre- or post-delivery [54], and outcomes differ between institutions and countries [55]. Finally, our thematic analysis highlights that clinical decision-making involves three main ethical challenges: (i) social, cultural-religious, and legal contexts; (ii) uncertainty about the number of weeks of gestation; and (iii) difficult decisions about intensive or comfort care that are in the best interest of the infant and its parents, and that address moral distress and professional virtues [54].
Finally, all authors highlight that clinical dilemmas are complex, which creates conditions that favor the risk of bias, and underline the importance of time pressure. Furthermore, they highlight the problem of a lack of information about the patient’s wishes [51, 53], the criticality or volatility of his or her state of health [51], and viability [56]. These characteristics foster volatility, uncertainty, complexity and ambiguity—an environment that increases the risk of bias for CES. Such situations are called VUCA environments (Volatility, Uncertainty, Complexity and Ambiguity environments) in military and management contexts; they are known to challenge cognitive adaptation capacities, because of their deleterious impacts on decision-making [56].
Types of cognitive bias
Each of the four papers identifies different biases, providing examples and placing them within theoretical frameworks. We divided these biases into two main categories. The first refers to the human working environment, in terms of possible biases affecting stakeholders, CES members, or the CES itself; these biases can be observed at the individual, relational, or group level. The second refers to biases that run from information gathering through to decision-making, and targets different stages of information processing, including committee deliberation. Magelssen et al. [52] describe two activities that are relevant to CES objectives. The first is to provide an analysis of morally relevant features of the clinical dilemma (including values, facts, interests, legislation, alternative courses of action and their consequences). The second is to provide appropriate moral advice to decision-makers (if necessary). Such an approach identifies the conditions under which bias is at risk of emerging. The first condition concerns at-risk working environments. On the one hand, the structure or hospital management may provide a working environment that favors cognitive biases, including the introduction of biases when hiring individual ethics consultants. These biases include the Hawthorne effect, conformism, and authority bias. Magelssen et al. [52] even suggest that they can create conflicts of interest, as individual ethics consultants will be influenced not only by their actual values and desires, but also by what they perceive to be desirable within their working environment. These biases characterize a type of response bias, namely the tendency of respondents to answer questions in a manner that will be viewed favorably by others, which is related to social desirability bias. For example, closely related to social desirability bias, conformity bias occurs when social pressure influences an individual's decisions and judgments: in order not to feel excluded, a person will consciously, or unconsciously, join the dominant opinion or act like the majority of the group. On the other hand, bias can be introduced through an excessive focus on current legislation and regulation, to the detriment of ethical concerns. Such an imbalance degrades the CES's role as a critic when the healthcare provider treats the minimal requirements of a law or regulation as the ethical ceiling. In this vein, a medical act that is illegal could, nevertheless, be morally acceptable, and a better decision for a patient. Magelssen et al. [52] also suggest that such a bias could be compared to the bias that may arise when an individual member of a CES relies on ideological or religious reasoning.
Magelssen et al. [52] also highlight the myriad of competing moral perspectives and conditions that can provide a breeding ground for bias within the CES. The CES's discussions may be judged to be biased towards one or more moral theories or perspectives that are inadequate or misleading. Magelssen et al. [52] illustrate this situation with the moral theory called 'principlism', developed by Beauchamp and Childress [19]. The latter could favor certain forms of deontology, or what is understood as common morality. In this situation, confirmation bias can be introduced, either through reliance on arguments from an erroneous or inadequate moral theory or perspective, or through the rationalization of a preferred conclusion by appealing to arguments that support it.
Secondly, there is a risk of authority bias linked to the CES's stakeholders: patients, committee members, and clinicians alike. As with the working environment, the clinical culture and the interests of healthcare professionals can favor social desirability bias per se. Finally, it should be noted that Magelssen et al. [52] do not refer to biases that are problematic for improving mutual understanding between stakeholders; they also make no recommendations regarding strategies that could be used to resolve conflicts.
Schleger et al. [53] distinguish between group and individual biases. They illustrate systematic inadequacies in group and individual decisions, and highlight their application to clinical ethical decisions. Group bias focuses on the hidden profile bias, which refers to the tendency of groups to concentrate on what the majority of their members know [57]; this is also termed 'process loss' in social psychology. Instead of sharing all of the available information, individual group members only share a fragment of it, neglecting so-called unshared information. The group thus tends to focus on shared information at the expense of unshared information. This bias produces suboptimal team decisions. It differs from groupthink, which is defined as a quick and simple decision-making method [58] used by a very cohesive group, with a tendency to conform to social pressure that increases with the size of the group. These two biases are directly involved in deliberations about a clinical dilemma. Finally, the authors discuss informative and normative social influences, and highlight the social risk of inappropriate or changing behaviors intended to adapt to another person or group under conditions of uncertainty.
Turning to individual biases, Schleger et al. [53] highlight the risks due to just world bias, stereotypes, and omission bias. Just world bias refers to the assumption of a manageable and predictable world, which underlies how people orient themselves in their environment, notably the idea that patients' medical problems are due to their own behavior [59]. The belief that everyone reaps what they sow risks tilting decision-making to the benefit of people to whom good things happen, and to the detriment of those to whom bad things happen. Stereotypes and prejudices are forms of implicit bias: individuals are assumed to have specific traits and attributes, due to their characteristics. This overgeneralization with respect to the members of a social group is a general feature of humankind, but it can implicitly influence social judgement, and impact decision-making when faced with a clinical dilemma. Finally, omission bias refers to the observation that people often evaluate a decision to commit an action more negatively than a decision to omit an action, given that both decisions have the same negative consequence. This bias was observed in medical practice by Spranca et al. [60] under conditions of uncertainty (an intensive care unit). Healthcare practitioners are frequently exposed to the dilemma of passive non-action versus maintaining the present state with life-sustaining measures.
Blackstone et al. [51] focus on biases that may cloud the judgment of PWP committee members who work with the medical team and social workers, and may interfere with determining what is in the best interest of an incapacitated patient. They list five main biases related to information gathering, discussion and information sharing, and decision-making processes. First, information gathering may be biased by personal values and experience, due to a lack of information about the PWP's life and values. An example is information bias, which refers to committee members overgeneralizing from their personal experience, understood as an autobiographical memory bias. The success of the meeting is a function of members' ability to share impartial medical information that is helpful in guiding decisions regarding the PWP, and of the absence of groupthink, which is known to conflict with critical thinking [51]. The study reports that some of the interviewed PWP committee members highlighted the risk of affective bias when visiting a patient who was critically ill, or when they had to take into account the inconsistent or conflicting wishes of an awake, but incapacitated, PWP. Finally, interviewees underlined the importance of the PWP protocol, which makes it possible to counterbalance the biases of each committee member, and institutional biases, in order to give an opinion without ultimately knowing whether this is what the patient would have wanted.
Stanak and Hawlik [54] describe five categories of bias identified in interviews with neonatologists and the ethical expert. The first takes two forms: institutional and omission bias, due to statistical data about viability and the beliefs of the professional. The second refers to the parents' understanding of information about viability, and their distress. Stanak and Hawlik [54] underline the importance of paternalism bias, given the high level of uncertainty experienced by parents; the latter can be likened to authority bias. The third category focuses on the quality of the information that is shared between parents and professionals. Professionals must give medical information to parents, and this information will be the main element in the risk calculation that parents have to make in order to take a decision. In this context, the authors note the risk of framing bias with respect to the estimated number of weeks of gestation. Framing bias is one of many cognitive biases: people react to a particular choice in different ways, depending on how it is presented. This bias underlines the importance of the order in which medical information is given to parents. For example, when parents are presented with a list of complications that starts with the rarest, and ends with the most common, they tend to decide against the intervention. Similarly, it is well known that communicating proportional outcomes (the majority of information given to parents is in this form) is a source of bias for individuals who do not understand proportions and percentages. The latter observation relates to the intuitive versus deliberative thinking that Kahneman [26] describes as System 1 and System 2.
An overview of the reported biases, according to the two main categories, is presented in Table 2.
Discussion
The reported cognitive biases
This scoping review describes the types of cognitive bias that can influence CES deliberations. The corpus of data encompasses both empirical and theoretical papers. Our findings can help to better integrate biases into decision-making models in situations that involve a clinical ethical dilemma, and into decision-making contexts. The field is little explored, and our scoping review can help to identify gaps and synthesize knowledge [42]. In particular, it may help CES members to better address biases during their deliberations. Clinical ethics deliberation can be viewed as a space-time relationship between personal judgments, and a collegial discussion of a medical situation within an institution. Cognitive bias must be considered at several levels: the individual (the clinician, the family member, the guardian), the group, and the institution in which the CES operates.
Our review highlights many types of bias, although the list is non-exhaustive. The reported biases were described in terms of human working environment bias and information-gathering bias (Table 2). In light of these two categories, it is possible to propose an analytical framework for cognitive biases in CES based on three levels: the individual, the group, and the professional. Each of these levels can include cognitive biases from both the human working environment and information gathering.
Individual cognitive biases include belief in a just world, framing bias, omission bias, and affective bias. Individual social biases refer to religious or cultural values, or stereotypes that can influence the judgment of the clinician. Such biases are not specific to decision-making in the clinical context. Nevertheless, they are reported in the two theoretical papers included in our corpus, and need to be explored in the context of real-life CES deliberations.
The second main category of bias relates to CES functioning. It includes two subtypes. The first subtype is communication bias, which is reported by both Blackstone et al. [51] and Stanak and Hawlik [54]. Communication bias refers to situational exchanges with stakeholders, in particular parents [54], and trusted family and friends [52]. These exchanges are part of the work of the CES, especially when clinical uncertainty is high in terms of medical sequelae [54], or the patient's wishes [52]. Although the main focus is on paternalism and affective biases, communication bias also refers to how information is presented or gathered, and then synthesized by clinicians or ethicists to communicate uncertainty and help in making shared decisions. In particular, Stanak and Hawlik [54] point out that statistics can be misunderstood by stakeholders, and that this approach constitutes a bias that degrades the shared message [61]. Bias in interpersonal communication is a recognized risk in the context of diagnostic errors [62], but less is known in the field of CES functioning. The second subtype is group bias; here, the focus is mainly on hidden profiles and groupthink. These biases are well-known obstacles to high-quality discussions [63], and to the sharing of medical information. They hinder the sharing that is critical in guiding clinical decisions.
The third main category is professional bias. Deontological bias includes values bias, and legal and institutional bias. Work environment bias includes administrative structures, the organization, and corporatism. Such biases are mainly found among CES members, but are not directly dependent on actions that CES members could implement by themselves.
Finally, while the overview is not exhaustive, and the taxonomy is by no means absolute, it provides initial guidance with respect to assessing the relevance of various biases for specific kinds of clinical ethics work.
The determinants of cognitive biases
The above findings call for at least two comments that target the external determinants of cognitive bias. First, whatever the bias, all authors highlighted the importance of the context CES members find themselves in, in particular time pressure. This stressor should be considered as another environmental factor impacting the consideration of clinical ethical dilemmas. It is well known that a VUCA context favors a gray zone of decision-making, where different practices may lead to different outcomes, and it is known to challenge rational decision-making [56]. Yet the issue has been little studied in the field of clinical decision-making [64], and there are even fewer studies in the context of clinical ethics decision-making; most studies have been conducted in the VUCA contexts of prehospital and disaster medicine [65, 66]. In line with this, the second comment highlights affective biases. Such a bias was only reported in one of the four reviewed papers [52]. However, studies have demonstrated that affective bias induced by fatigue or negative affect shifts the balance of processing toward more intuitive responses, facilitating biased judgements and decision-making [67]. For example, stress and fatigue are well known to produce irritability, intolerance, and other mood changes that will also exert an influence on judgment [67]. Furthermore, affects can be influenced by a variety of ambient, chronobiological, and other variables, including the VUCA context.
Cognitive bias and its consequences for ethical decision
It is interesting to note that the question of whether the reviewed biases impact the final decision is not discussed. By definition, the outcome is uncertain, as the situation is characterized by the fact that any decision has unpredictable consequences, regardless of any bias. This leads us to ask, what does a good CES decision look like? Is it carrying out the patient’s wishes, or is it acting in his/her best interest?
Prosaically, a good decision is one that produces good results. However, this definition is problematic. Firstly, it is not very useful: a CES makes a decision that is based on uncertainty, incomplete information, and risk-taking. Moreover, it is difficult to know what the situation will be when we evaluate the outcome of the decision. Good decisions can sometimes produce bad results, or vice versa; this is called chance, or luck [68]. Since the seventeenth century, decisions have been studied in terms of probability, leading us to propose the following definition: the right rational choice is the one that gives you the best chance of achieving your objectives at the time the decision is made, with the information available [68]. This means that it is possible to define objectives that are in accordance with existing ethical theories, and which can be applied to a clinical ethics dilemma. Consequently, the impact of biases could be evaluated by focusing on how, and in what situations, they conflict with ethical theories. Furthermore, bias, which leads to an error that is both predictable and shared, must be complemented by noise: a random, unpredictable error that induces variability in judgments.
A noise audit consists of asking different, equally competent experts to evaluate the same problem. Empirical findings are inconsistent to a point that astonishes professionals themselves [68]. While some variability may be desirable in situations that benefit from a diversity of viewpoints, this is not the case for medical decisions. But what about CES decisions? The diversity of members' points of view, however biased, is put to the test by ethical frameworks that discuss cognitive frugality and decision hygiene [68]. Cognitive frugality, in the clinical context, refers to the framework of ethical principles and values, which, in turn, refers to the level of expertise, and therefore the quality of training, of CES members. Decision hygiene emphasizes the deliberative framework that supports an argumentative, contradictory discussion of the relevance of the analysis, questioning concepts and knowledge, and a quality assessment of the CES's work. Group work is essential: it ensures that each individual's judgment is not based solely on that of the group, and ultimately leads to a form of shared responsibility [63].
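To make the bias/noise distinction concrete, the following minimal sketch (all numbers are invented for illustration) treats bias as the shared deviation of a panel's judgments from a reference value, and noise as the spread of judgments around the panel's own mean, as in a noise audit [68].

```python
# Minimal illustration of the bias/noise distinction (invented numbers).
# Bias: the shared, systematic deviation of the panel mean from a reference.
# Noise: the variability of equally competent experts judging the same case.
import statistics

reference = 50.0  # a hypothetical "right" evaluation, if one could be fixed
judgments = [62.0, 58.0, 71.0, 55.0, 66.0]  # five experts, one identical case

panel_mean = statistics.mean(judgments)
bias = panel_mean - reference        # predictable, shared error
noise = statistics.stdev(judgments)  # random, unpredictable scatter

print(f"panel mean = {panel_mean:.1f}, bias = {bias:+.1f}, noise (SD) = {noise:.1f}")
# Even if bias were zero (panel_mean == reference), a large SD would still
# mean that the advice a given case receives depends on who evaluates it.
```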
Possible countermeasures
None of the authors focused on individual characteristics (apart from values and moral capacities), even though providing ethical advice involves Person-Situation Interactionist ethical decision-making [69, 70]. Ethical competence requires normative knowledge, and the willingness to defend behavioral options in the face of resistance [71]. It can be understood as a complex mosaic of processes, in which components and personal characteristics interact with situations where taking responsibility is a must, rather than as a single construct or personality trait. In particular, Pohling et al. [71] highlight the role of affective empathy, personal values, and the five-factor model of personality.
More recently, mindfulness disposition, which denotes non-judgmental attention to the present moment [72], has been linked to ethics [73, 74], ethical decision-making [75,76,77], and moral reasoning [78]. In the psychological literature, biases of attention and memory are suggested to be implicated in affective biases [79]. One explanation is that affective biases increase the tendency of attention to return again and again to mental images that spark negative affect, and that mindfulness decreases this proliferation by attenuating affective biases of attention and memory. Consequently, mindfulness, besides leading to increased awareness of one's own emotional reactions, could lead to more accurate awareness, in particular by attenuating the affective biases that underlie distortions of attention and memory [80, 81]. Furthermore, mindfulness could be an efficient way to detect decision-making bias, by enabling System 2 thinking; it could facilitate cognitive de-automatization in certain experimental situations [82, 83]. Furthermore, mindful meditation can suspend preconceived ideas, encourage cognitive flexibility, and support emotional regulation when faced with ethical dilemmas [84]. It also enhances moral reasoning, mindfulness itself, and compassion, while reducing egocentric bias, which is understood as the desire to act only for personal gain [76].
Furthermore, the observations lead us to ask what training is needed to ensure that cognitive frugality and decision hygiene enable the most-appropriate ethical advice to be given in response to a here-and-now ethical dilemma. Mindful functioning appears to aid self-regulation by supporting attentional control, emotional regulation, self-awareness [85], and moral awareness [82, 83]. From the perspective of dual process theories, it can help to facilitate cognitive de-automatization in ethical decision monitoring [24]. Cognitive frugality can also help in accepting the need to respect the ethical framework.
Key questions for the future
At the present time, many other questions remain unanswered. A first question refers to affective biases, which are considered to be ingrained in the psychology of subjects, and can generally be harder to overcome than cognitive biases. Although these biases are not necessarily always errors, attention should be paid to affective conditions and any other significant extraneous factors that favor affective bias. It is important to examine them further, to study how they interact with cognitive biases in the CES work environment. Another question concerns the 'switch' feature, which is the mechanism by which a reasoner can decide to shift between more intuitive and more deliberate processing [86]. The latter author proposes that a combination of intuitive activation, uncertainty monitoring, deliberation, and feedback components forms the basic architecture of a dual process model that can explain how people switch between System 1 and System 2 thinking. Mindfulness functioning could facilitate the capacity for cognitive decoupling, and improve conscious access to the additional cues that are necessary for ethical decision-making [85, 86]. A further question concerns the benefit of mindfulness in dealing with noise. This topic needs further study. In particular, it would be interesting to better understand the relationship between expertise and mindfulness functioning, to assess whether or not the two work in concert, and how they reduce intra- and inter-subject variability in ethical decision-making situations. Answering these questions would open up applied perspectives based on the development of mindfulness, to help CES members manage bias. Mindfulness could be considered as part of System 3, a system of inhibition that manages inappropriate bias [87].
Limitations
First, this review was based exclusively on full-text articles published in English. Hence, relevant abstracts, presentations, theses, and unpublished reports may have been excluded. Second, it was difficult to decide which articles to exclude, since there are no real descriptions of how CES members deliberate. We included two empirical articles that describe ethical deliberation in two CES, and two theoretical papers that discuss the conceptual study of cognitive bias. Future studies need to focus on CES deliberation in ecological conditions, and on different clinical ethical dilemmas, as this would better characterize how mental state bias, action bias, and inertia bias impact the quality of decision-making. Studies also need to investigate shared ethical decision-making. Lastly, an additional interesting avenue for future research would be to evaluate mindfulness functioning, in order to better understand whether certain cognitive factors can protect against bias and noise during deliberation.
Conclusion
This scoping review studied the literature on cognitive bias in clinical ethics deliberation. It shows that many common biases can be present. Our review highlights that ethical deliberation is particularly at risk of cognitive bias, whatever the clinical dilemma. Although it is difficult to propose a practical categorization of cognitive bias in this specific context, our analysis emphasizes the interest of mindful functioning, as it could help to develop cognitive frugality and decision hygiene. It could also help in conducting a deliberation that is as unbiased as possible, enabling the most appropriate ethical decision to emerge in response to the dilemma at hand.
Data availability
No datasets were generated or analysed during the current study.
Abbreviations
- CES: Clinical ethics supports
- EDM: Ethical decision-making
- VUCA: Volatility, Uncertainty, Complexity and Ambiguity
- PWP: Patients without proxies
References
Rasoal D, Skovdahl K, Gifford M, Kihlgren A. Clinical ethics support for healthcare personnel: an integrative literature review. HEC Forum. 2017;29(4):313–46. https://doi.org/10.1007/s10730-017-9325-4.
Tan DYB, Ter Meulen BC, Molewijk A, Widdershoven G. Moral case deliberation. Pract Neurol. 2018;18(3):181–6. https://doi.org/10.1136/practneurol-2017-001740.
Gaudine A, LeFort SM, Lamb M, Thorne L. Ethical conflicts with hospitals: the perspective of nurses and physicians. Nurs Ethics. 2011;18(6):756–66. https://doi.org/10.1177/0969733011401121.
Hurst SA, Perrier A, Pegoraro R, Reiter-Theil S, Forde R, Slowther AM, Garrett-Mayer E, Danis M. Ethical difficulties in clinical practice: experiences of European doctors. J Med Ethics. 2007;33(1):51–7. https://doi.org/10.1136/jme.2005.014266.
Arries E. Virtue ethics: an approach to moral dilemmas in nursing. Curationis. 2005;28(3):64–72. https://doi.org/10.4102/curationis.v28i3.990.
Moeller JR, Albanese TH, Garchar K, Aultman JM, Radwany S, Frate D. Functions and outcomes of a clinical medical ethics committee: a review of 100 consults. HEC Forum. 2012;24(2):99–114. https://doi.org/10.1007/s10730-011-9170-9.
Cohen JS, Erickson JM. Ethical dilemmas and moral distress in oncology nursing practice. Clin J Oncol Nurs. 2006;10(6):775–80. https://doi.org/10.1188/06.CJON.775-780.
Gutierrez KM. Critical care nurses’ perceptions of and responses to moral distress. Dimens Crit Care Nurs. 2005;24(5):229–41. https://doi.org/10.1097/00003465-200509000-00011.
Jameton A. Nursing Practice: The Ethical Issues. Englewood Cliffs, NJ: Prentice-Hall, Inc; 1984.
Prentice T, Janvier A, Gillam L, Davis PG. Moral distress within neonatal and paediatric intensive care units: a systematic review. Arch Dis Child. 2016;101(8):701–8. https://doi.org/10.1136/archdischild-2015-309410.
Sporrong SK, Höglund AT, Arnetz B. Measuring moral distress in pharmacy and clinical practice. Nurs Ethics. 2006;13(4):416–27. https://doi.org/10.1191/0969733006ne880oa.
Ulrich C, O’Donnell P, Taylor C, Farrar A, Danis M, Grady C. Ethical climate, ethics stress, and the job satisfaction of nurses and social workers in the United States. Soc Sci Med. 2007;65(8):1708–19. https://doi.org/10.1016/j.socscimed.2007.05.050.
Cranford RE, Doudera AE. The emergence of institutional ethics committees. Law Med Health Care. 1984;12(1):13–20. https://doi.org/10.1111/j.1748-720x.1984.tb01755.x.
Førde R, Pedersen R. Manual for working in a clinical ethics committee in secondary health services. Oslo: Centre for Medical Ethics, University of Oslo; 2012.
Abma TA, Nierse CJ, Widdershoven GA. Patients as partners in responsive research: methodological notions for collaborations in mixed research teams. Qual Health Res. 2009;19(3):401–15. https://doi.org/10.1177/1049732309331869.
Molewijk B, Kleinlugtenbelt D, Widdershoven G. The role of emotions in moral case deliberation: theory, practice, and methodology. Bioethics. 2011;25(7):383–93. https://doi.org/10.1111/j.1467-8519.2011.01914.x.
Taylor RM. Ethical principles and concepts in medicine. Handb Clin Neurol. 2013;118:1–9. https://doi.org/10.1016/B978-0-444-53501-6.00001-9.
Hain RDW. Core ethics for health professionals: principles, issues and compliance. New Bioeth. 2018;24(2):193–5. https://doi.org/10.1080/20502877.2018.1468598.
Beauchamp TL, Childress JF. Principles of biomedical ethics. 5th ed. New York: Oxford University Press; 2001.
Agich GJ. Defense mechanisms in ethics consultation. HEC Forum. 2011;23(4):269–79. https://doi.org/10.1007/s10730-011-9165-6.
Steinkamp N, Gordijn B. Ethical case deliberation on the ward. A comparison of four methods. Med Health Care Philos. 2003;6(3):235–46. https://doi.org/10.1023/a:1025928617468.
Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185(4157):1124–31. https://doi.org/10.1126/science.185.4157.1124.
Wilson TD, Brekke N. Mental contamination and mental correction: unwanted influences on judgments and evaluations. Psychol Bull. 1994;116(1):117–42. https://doi.org/10.1037/0033-2909.116.1.117.
De Neys W, Glumicic T. Conflict monitoring in dual process theories of thinking. Cognition. 2008;106(3):1248–99. https://doi.org/10.1016/j.cognition.2007.06.002.
Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–78. https://doi.org/10.1146/annurev.psych.59.103006.093629.
Kahneman D. Thinking, Fast and Slow. New York (NY): Farrar, Straus and Giroux; 2011.
Blumenthal-Barby JS, Krieger H. Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy. Med Decis Making. 2015;35(4):539–57. https://doi.org/10.1177/0272989X14547740.
Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22 Suppl 2(Suppl 2):ii58–64. https://doi.org/10.1136/bmjqs-2012-001712.
Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med. 2017;92(1):23–30. https://doi.org/10.1097/ACM.0000000000001421.
Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22 Suppl 2(Suppl 2):ii65–72. https://doi.org/10.1136/bmjqs-2012-001713.
Coen M, Sader J, Junod-Perron N, Audétat MC, Nendaz M. Clinical reasoning in dire times. Analysis of cognitive biases in clinical cases during the COVID-19 pandemic. Intern Emerg Med. 2022;17(4):979–88. https://doi.org/10.1007/s11739-021-02884-9.
Hofmann B. Biases in bioethics: a narrative review. BMC Med Ethics. 2023;24(1):17. https://doi.org/10.1186/s12910-023-00894-0.
Rest JR. Moral development: Advances in research and theory. New York (NY): Praeger; 1986.
Jones TM. Ethical decision making by individuals in organizations: An issue-contingent model. AMR. 1991;16(2):366–95. https://doi.org/10.2307/258867.
Kahneman D. A perspective on judgment and choice: Mapping bounded rationality. Am Psychol. 2003;58(9):697–720. https://doi.org/10.1037/0003-066X.58.9.697.
Haidt J. The emotional dog and its rational tail: a social intuitionist approach to moral judgment. Psychol Rev. 2001;108(4):814–34. https://doi.org/10.1037/0033-295x.108.4.814.
Sonenshein S. The role of construction, intuition, and justification in responding to ethical issues at work: The sensemaking-intuition model. AMR. 2007;32(4):1022–40.
Johnson MK, Weeks SN, Peacock GG, Domenech Rodríguez MM. Ethical decision-making models: a taxonomy of models and review of issues. Ethics Behav. 2022;32(3):195–209. https://doi.org/10.1080/10508422.2021.1913593.
Toh HJ, Low JA, Lim ZY, Lim Y, Siddiqui S, Tan L. Jonsen’s four topics approach as a framework for clinical ethics consultation. Asian Bioeth Rev. 2018;10(1):37–51. https://doi.org/10.1007/s41649-018-0047-y.
Légaré F, Adekpedjou R, Stacey D, Turcotte S, Kryworuchko J, Graham ID, Lyddiatt A, Politi MC, Thomson R, Elwyn G, Donner-Banzhoff N. Interventions for increasing the use of shared decision making by healthcare professionals. Cochrane Database Syst Rev. 2018;7(7):CD006732. https://doi.org/10.1002/14651858.CD006732.pub4.
Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, Moher D, Peters MDJ, Horsley T, Weeks L, Hempel S, Akl EA, Chang C, McGowan J, Stewart L, Hartling L, Aldcroft A, Wilson MG, Garritty C, Lewin S, Godfrey CM, Macdonald MT, Langlois EV, Soares-Weiser K, Moriarty J, Clifford T, Tunçalp Ö, Straus SE. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73. https://doi.org/10.7326/M18-0850.
Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32. https://doi.org/10.1080/1364557032000119616.
Mak S, Thomas A. Steps for conducting a scoping review. J Grad Med Educ. 2022;14(5):565–7. https://doi.org/10.4300/JGME-D-22-00621.1.
Peters MDJ, Marnie C, Tricco AC, Pollock D, Munn Z, Alexander L, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth. 2020;18(10):2119–26. https://doi.org/10.11124/JBIES-20-00167.
Jackson N, Waters E. Guidelines for systematic reviews in health promotion and public health taskforce. Criteria for the systematic review of health promotion and public health interventions. Health Promot Int. 2005;20(4):367–74. https://doi.org/10.1093/heapro/dai022.
CRD (Centre for Reviews and Dissemination). Systematic reviews: CRD’s guidance for undertaking reviews in health care. York, UK: York Publishing Services, Ltd; 2009.
Smith V, Devane D, Begley CM, Clarke M. Methodology in conducting a systematic review of systematic reviews of healthcare interventions. BMC Med Res Methodol. 2011;11(1):15. https://doi.org/10.1186/1471-2288-11-15.
Graneheim UH, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today. 2004;24(2):105–12. https://doi.org/10.1016/j.nedt.2003.10.001.
Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88. https://doi.org/10.1177/1049732305276687.
Pope C, Mays N, Popay J. How can we synthesize qualitative and quantitative evidence for healthcare policy-makers and managers? Healthc Manage Forum. 2006;19(1):27–31. https://doi.org/10.1016/S0840-4704(10)60079-8.
Blackstone E, Daly BJ, Griggins C. Making medical decisions for incapacitated patients without proxies: Part II. HEC Forum. 2020;32(1):47–62. https://doi.org/10.1007/s10730-019-09388-2.
Magelssen M, Pedersen R, Førde R. Sources of bias in clinical ethics case deliberation. J Med Ethics. 2014;40(10):678–82. https://doi.org/10.1136/medethics-2013-101604.
Albisser Schleger H, Oehninger NR, Reiter-Theil S. Avoiding bias in medical ethical decision-making. Lessons to be learnt from psychology research. Med Health Care Philos. 2011;14(2):155–62. https://doi.org/10.1007/s11019-010-9263-2.
Stanak M, Hawlik K. Decision-making at the limit of viability: the Austrian neonatal choice context. BMC Pediatr. 2019;19:204. https://doi.org/10.1186/s12887-019-1569-5.
Zeitlin J, Szamotulska K, Drewniak N, Mohangoo AD, Chalmers J, Sakkeus L, Irgens L, Gatt M, Gissler M, Blondel B; Euro-Peristat Preterm Study Group. Preterm birth time trends in Europe: a study of 19 countries. BJOG. 2013;120(11):1356–65. https://doi.org/10.1111/1471-0528.12281.
Waldeck R. Decision making in a VUCA context. Working Papers hal-03763719, HAL; 2022.
Stasser G, Titus W. Pooling of unshared information in group decision making: Biased information sampling during discussion. J Pers Soc Psychol. 1985;48(6):1467–78. https://doi.org/10.1037/0022-3514.48.6.1467.
Janis I. Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Boston: Houghton Mifflin; 1972.
Lerner MJ. The belief in a just world. In: The Belief in a Just World: A Fundamental Delusion. New York (NY): Springer; 1980. p. 9–30. https://doi.org/10.1007/978-1-4899-0448-5_2.
Spranca M, Minsk E, Baron J. Omission and commission in judgment and choice. J Exp Soc Psychol. 1991;27:76–105. https://doi.org/10.1016/0022-1031(91)90011-T.
Pennycook G, Cheyne JA, Barr N, Koehler DJ, Fugelsang JA. The role of analytic thinking in moral judgements and values. Think Reason. 2014;20(2):188–214. https://doi.org/10.1080/13546783.2013.865000.
Dahm MR, Cattanach W, Williams M, Basseal JM, Gleason K, Crock C. Communication of diagnostic uncertainty in primary care and its impact on patient experience: an integrative systematic review. J Gen Intern Med. 2023;38(3):738–54. https://doi.org/10.1007/s11606-022-07768-y.
Habermas J. De l’éthique de la discussion. Paris: Flammarion; 1991.
Hirsch E. Horses and zebras: probabilities, uncertainty, and cognitive bias in clinical diagnosis. Am J Obstet Gynecol. 2020;222(5):469.e1–469.e3. https://doi.org/10.1016/j.ajog.2020.01.010.
Cuthbertson J, Penney G. Ethical decision making in disaster and emergency management: a systematic review of the literature. Prehosp Disaster Med. 2023;38(5):622–7. https://doi.org/10.1017/S1049023X23006325.
Torabi M, Borhani F, Abbaszadeh A, Atashzadeh-Shoorideh F. Experiences of pre-hospital emergency medical personnel in ethical decision-making: a qualitative study. BMC Med Ethics. 2018;19(1):95. https://doi.org/10.1186/s12910-018-0334-x.
Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. In: Henriksen K, Battles JB, Marks ES, et al., editors. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville (MD): Agency for Healthcare Research and Quality (US); 2005 Feb. Available from: https://www.ncbi.nlm.nih.gov/books/NBK20487.
Kahneman D, Sibony O, Sunstein CR. Noise. Paris: Odile Jacob; 2021.
Trevino LK. Ethical decision making in organizations: a person-situation interactionist model. AMR. 1986;11(3):601–17. https://doi.org/10.2307/258313.
Schwartz MS. Ethical decision-making theory: an integrated approach. J Bus Ethics. 2016;139:755–76. https://doi.org/10.1007/s10551-015-2886-8.
Pohling R, Bzdok D, Eigenstetter M, Stumpf S, Strobel A. What is ethical competence? The role of empathy, personal values, and the five-factor model of personality in ethical decision-making. J Bus Ethics. 2016;137:449–74. https://doi.org/10.1007/s10551-015-2569-5.
Chiesa A. The difficulty of defining mindfulness: Current thought and critical issues. Mindfulness. 2013;4(3):255–68. https://doi.org/10.1007/s12671-012-0123-4.
Amaro A. A holistic mindfulness. Mindfulness. 2015;6:63–73. https://doi.org/10.1007/s12671-014-0382-3.
Verhaeghen P. Good and well: the case for secular Buddhist ethics. Contemporary Buddhism. 2015;16(1):43–54. https://doi.org/10.1080/14639947.2015.1006802.
Craft JL. A review of the empirical ethical decision-making literature: 2004–2011. J Bus Ethics. 2013;117(2):221–59. https://doi.org/10.1007/s10551-012-1518-9.
Pandey A, Chandwani R, Navare A. How can mindfulness enhance moral reasoning? An examination using business school students. Business Ethics: A European Review. 2018;27(1):56–71. https://doi.org/10.1111/beer.12171.
Valentine S, Godkin L, Varca P. Role conflict, mindfulness, and organizational ethics in an education-based healthcare institution. J Bus Ethics. 2010;94(3):455–69. https://doi.org/10.1007/s10551-009-0276-9.
Small C, Lew C. Mindfulness, moral reasoning and responsibility: towards virtue in ethical decision-making. J Bus Ethics. 2019;69(1):103–17. https://doi.org/10.1007/s10551-019-04272-y.
Todd RM, Cunningham WA, Anderson AK, Thompson E. Affect-biased attention as emotion regulation. Trends Cogn Sci. 2012;16(7):365–72. https://doi.org/10.1016/j.tics.2012.06.003.
Brewer JA, Elwafi HM, Davis JH. Craving to quit: psychological models and neurobiological mechanisms of mindfulness training as treatment for addictions. Psychol Addict Behav. 2013;27(2):366–79. https://doi.org/10.1037/a0028490.
Elliott R, Zahn R, Deakin JF, Anderson IM. Affective cognition and its disruption in mood disorders. Neuropsychopharmacology. 2011;36(1):153–82. https://doi.org/10.1038/npp.2010.77.
Lachaud L, Jacquet B, Baratgin J. Reducing choice-blindness? An experimental study comparing experienced meditators to non-meditators. Eur J Invest Health Psychol Educ. 2022;12:1607–20. https://doi.org/10.3390/ejihpe12110113.
Lachaud L, Jacquet B, Bourlier M, Baratgin J. Mindfulness-based stress reduction is linked with an improved Cognitive Reflection Test score. Front Psychol. 2023;14:1272324. https://doi.org/10.3389/fpsyg.2023.1272324.
Lampe M, Engleman-Lampe C. Mindfulness-based business ethics education. AELJ. 2012;16(3):99–111.
Tang YY, Hölzel BK, Posner MI. The neuroscience of mindfulness meditation. Nat Rev Neurosci. 2015;16(4):213–25. https://doi.org/10.1038/nrn3916.
De Neys W. Advancing theorizing about fast-and-slow thinking. Behav Brain Sci. 2023;46:e111. https://doi.org/10.1017/S0140525X2200142X.
Cachia A, Roell M, Mangin JF, Sun ZY, Jobert A, Braga L, Houde O, Dehaene S, Borst G. How interindividual differences in brain anatomy shape reading accuracy. Brain Struct Funct. 2018;223(2):701–12. https://doi.org/10.1007/s00429-017-1516-x.
Acknowledgements
Not applicable.
Funding
This work was supported by the Central Staff of the French Military Health Service (Grant No. 2023-PPRC-ESSPRESSO). The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. The statements in this article are those of the authors and in no way those of the French Military Health Service.
Author information
Contributions
MT conceived the study, analyzed and interpreted the results, and wrote the manuscript. LG and MT read and selected the papers. NP wrote the search equation and searched the papers. LG, AL, NP, FGF and MT discussed the results and critically reviewed the manuscript. All authors approved the final version for publication.
Ethics declarations
Ethics approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Giaume, L., Lamblin, A., Pinol, N. et al. Evaluating cognitive bias in clinical ethics supports: a scoping review. BMC Med Ethics 26, 16 (2025). https://doi.org/10.1186/s12910-025-01162-z
DOI: https://doi.org/10.1186/s12910-025-01162-z