Authors: Cathleen Parsons, Jonathan Adams, and Yassin Engelberts
What does being a clinician truly mean? At its core, are we primarily solving a medical puzzle, or finding the optimal treatment that meets the patient's needs? While many clinicians have long recognised the importance of patient-centred care, only recently was an amendment to the Dutch Medical Treatment Contracts Act (WGBO) introduced to ensure that patients' preferences and considerations are actively included in medical decision-making1. Still, implementing patient-centred care remains a daily challenge for clinicians.
Clinical decision support
Consider an elderly patient with a hip fracture. A hip fracture can significantly reduce both quality of life and life expectancy. Surgery often seems the logical choice. Yet studies have shown that non-operative treatment focused on pain management and comfort may be preferable in cases of frailty and limited life expectancy, where patients report similar levels of quality of life and treatment satisfaction2. How should clinicians approach decision-making when factors beyond medical considerations come into play? The answer lies in making patient autonomy the guiding principle.
In this blog, we delve into the concept of patient autonomy through the lens of Joel Anderson's scaffolding approach3. Anderson emphasises that patients can remain autonomous even when they rely on deliberation with others, such as clinicians or family members. In fact, such deliberation often helps patients better understand their values and consider alternative perspectives, ultimately extending their autonomy as long as they still feel that the resulting decision is 'their own'. External support, such as clinical decision support tools, therefore has the potential to enhance patients' sense of autonomy by increasing their deliberative capacity.
As artificial intelligence (AI) increasingly transforms healthcare, preserving patient autonomy must be a guiding principle in the design of AI-based clinical decision support systems (AI-CDSS). AI-CDSS can either help patients make more informed decisions by clarifying treatment options and encouraging clinicians to better understand patient values, or they can manipulate and marginalise patients by presenting one option as inherently superior. Anderson's scaffolding approach for autonomy requires that society actively evaluates and protects the environments and opportunities that enable people to develop their decision-making capacities.
In the context of AI-CDSS, we recommend public and patient involvement (PPI). PPI is founded on the principle that individuals, drawing on the unique expertise gained through their experiences, actively participate in the planning and execution of research.4 This collaboration should be active and meaningful: the patient has input across all stages of the project, and design decisions are guided by that input.5
The Role of Patient and Public Involvement
PPI in the design of AI-CDSS is essential because it directly relates to the principle of autonomy: patients should be able to make informed decisions about their healthcare. By including PPI during the design process, developers can gain valuable insights into the specific challenges and needs patients face when making these decisions. This involvement also promotes transparency and accountability.6
PPI will help to ensure that AI-CDSS are designed to be inclusive. By involving a diverse range of PPI representatives in the design process, developers can ensure that the systems are accessible to all patients, regardless of their background or circumstances.7 However, in health research, PPI frequently involves a homogeneous group of patients: typically white, (upper-)middle class, retired, without complex health or social care needs, and often from a health or research background.8 We foresee that this lack of diversity will lead to AI-CDSS that do not adequately address the needs of those who are most vulnerable or marginalised.
When PPI lacks diversity, it risks creating systems that fail to consider the unique challenges faced by different patient populations. This can result in biased or inaccessible AI-CDSS, which not only limits patients' ability to make autonomous decisions but also leads to worse healthcare outcomes for certain patient populations. For example, cultural differences, language barriers, and varying levels of health literacy can all affect how patients interact with AI-CDSS.7 When these factors are not adequately considered, the AI-CDSS may provide irrelevant or incorrect information, hindering patients' ability to make informed decisions about their care. Therefore, we recommend not only PPI but also executing it through a Patient Power-Up (PPU) approach: mindfully considering who you involve in the process to ensure a representative and diverse group of patients.
A key component of the PPU approach is making an effort to reach out to underrepresented groups. This includes patients who may be sicker or have less energy to participate. Effective outreach can involve partnering with community organizations and using accessible communication methods. To ensure meaningful participation, it is essential to provide patients with the resources and support they need. This can include educational materials to help them understand the process, training sessions to build their confidence, and compensation for their time.6
Ethical Principles and AI-CDSS
From a medical ethics perspective, the scaffolding approach offers a way to design AI-based clinical decision support systems (AI-CDSS) that respect the classic principles of autonomy, beneficence, nonmaleficence, and justice.9 These are by no means an entirely harmonious set of principles; a patient's autonomous choice may, for example, conflict with beneficence if they refuse a treatment that is likely to improve their well-being. Rather than prioritising autonomy in isolation, AI-CDSS should be designed to balance these ethical principles, recognising that they can sometimes be in tension, and the scaffolding approach can be a useful way of navigating these complex choices. As Anderson argues, 'scaffolding cannot be provided by being nudged into mindlessly adopted socially appropriate behaviour, but the autonomy-augmenting effects of scaffolding require skillful engagement.'3 To achieve this engagement, ethical principles should be seen as distinct perspectives on how AI-CDSS can either enhance or undermine patient autonomy, providing the (Patient) Power-Up that is needed.
Autonomy ensures that patients can make informed and meaningful choices about their care. Anderson highlights that autonomy is not an isolated trait but a relational process, with patients often relying on external supports, such as clinicians, family members, or decision aids, to navigate complex medical choices. AI-CDSS, when properly designed, should function as deliberative scaffolds, expanding the ability of patients and their family members to understand options, weigh outcomes, and align decisions with their values. To ensure AI-CDSS support reasoning rather than dictate outcomes, PPI must actively involve patients in design and refinement. However, traditional PPI often lacks diverse representation, limiting its effectiveness. The PPU approach corrects this by seeking out marginalised and underrepresented voices, ensuring AI-CDSS reflect a broad range of patient experiences and remain a tool for empowerment rather than passive compliance.
The principle of beneficence requires AI-CDSS to promote patient well-being by providing accurate, comprehensive, and comprehensible medical information. AI should enhance clinical decision-making, ensuring that patients are better informed, but it must avoid paternalism, where AI assumes what is best for the patient and encourages passive compliance. Instead, AI must function as a supportive partner, guiding rather than controlling decision-making. For AI-CDSS to fulfil beneficence, PPI and PPU are crucial. PPI ensures AI is developed with patient priorities in mind, while PPU broadens participation to include patients with diverse health needs and backgrounds. This guarantees that AI-CDSS are not just optimised for clinical efficiency but are also aligned with real-world patient values and experiences.
The principle of nonmaleficence, 'do no harm', is crucial to AI-CDSS development. While AI can improve decision-making, it also carries risks, particularly overreliance on AI-generated recommendations. Anderson warns that external supports must not replace but rather enhance decision-making. This means that patients and clinicians alike must not defer blindly to AI, as this erodes trust in their own judgment. To prevent overreliance, AI-CDSS must be explainable and transparent, allowing patients and clinicians to question, override, or contextualise AI outputs. However, explainability alone is not enough, as users may still develop a strong dependence on AI even when they understand how it works. Therefore, additional safeguards aimed at maintaining human control are necessary to ensure AI functions as a support tool rather than a lone decision-maker. PPI ensures that patients contribute to AI design, making systems more user-friendly and aligned with real concerns. PPU reinforces this by ensuring that PPI is not just a box-ticking exercise but an active engagement with the varied apprehensions of real patients.
The principle of justice demands that AI-CDSS be equitable, inclusive and, by extension, accessible, because without accessibility certain populations will effectively be excluded. Anderson argues that 'having access to the scaffolding one needs is a crucial component of social justice,'3 which means that patients requiring external supports should not be disadvantaged. However, AI trained on non-representative data risks reinforcing health disparities, disproportionately affecting marginalised communities. To uphold justice, AI-CDSS must be co-designed with diverse patient input through PPI. Without representation from underprivileged patients, non-native speakers, or those with disabilities, AI-CDSS will fail to address the full spectrum of patient needs. PPU strengthens this effort by ensuring that AI-CDSS empower rather than exclude, making healthcare decisions more equitable for all patients.
Conclusion
We see an opportunity for AI-CDSS to enhance patient autonomy during shared decision-making. The effect on patient autonomy requires active consideration during the design process to prevent patients from being manipulated and marginalised. By involving a diverse range of PPI representatives in the design process, Anderson's scaffolding approach to autonomy can be realised.
References
1 Meer 'samen beslissen' nodig door aangescherpte Wgbo [More 'shared decision-making' needed due to the amended Wgbo] | NTVG
2 Loggers SAI, Willems HC, Van Balen R, et al. Evaluation of Quality of Life After Nonoperative or Operative Management of Proximal Femoral Fractures in Frail Institutionalized Patients: The FRAIL-HIP Study. JAMA Surg. 2022;157(5):424-434. doi:10.1001/jamasurg.2022.0089
3 Anderson, J. (2022). Scaffolding and autonomy. In The Routledge Handbook of Autonomy (pp. 158-166). Routledge.
4 D, et al. Monitoring and evaluation of patient engagement in Health Product Research and Development: Co-Creating A Framework for Community Advisory Boards. Journal of Patient-Centered Research and Reviews. 2022;9(1):46â57. doi:10.17294/2330-0698.1859
5 Harrington RL, Hanna ML, Oehrlein EM, Camp R, Wheeler R, Cooblall C, et al. Defining Patient Engagement in Research: Results of a Systematic Review and Analysis: Report of the ISPOR Patient-Centered Special Interest Group. Value Health. 2020;23(6):677-88.
6 Karlsson AW, et al. Roles, outcomes, and enablers within research partnerships: a rapid review of the literature on patient and public involvement and engagement in health research. Res Involv Engagem. 2023;9(1):43.
7 Zidaru T, Morrow EM, Stockley R. Ensuring patient and public involvement in the transition to AI-assisted mental health care: A systematic scoping review and agenda for design justice. Health Expect. 2021 Aug;24(4):1072-1124. doi: 10.1111/hex.13299. Epub 2021 Jun 12. PMID: 34118185; PMCID: PMC8369091.
8 Maguire K, Britten N. 'How can anybody be representative for those kind of people?' Forms of patient representation in health research, and why it is always contestable. Soc Sci Med. 2017;183:62-9.
9 Beauchamp TL, Childress JF. Principles of Biomedical Ethics. Oxford University Press; 1979.