OPINION

Artificial intelligence in health care: should it replace physicians?

Abi Sriharan, DPhil, and Savithiri Ratnapalan, PhD

Sriharan A, Ratnapalan S. Artificial intelligence in health care: should it replace physicians?

Can J Physician Leadersh 9(2): 32-35

https://doi.org/10.37964/cr24768

The global artificial intelligence (AI) in health care market was worth $7.4 billion in 2021 and is expected to reach $48.77 billion by 2027.1 Although such estimates remain theoretical, recent reports suggest that AI has the potential to improve up to 40% of patient care, diagnostics, research, and administrative tasks.2 Are current AI solutions ready to replace physicians?

From task automation to decision support, and from remote monitoring, imaging, and diagnostics to workflow optimization, AI systems are being deployed and tested to improve the daily functions of health care organizations. However, current AI systems have yet to mature to the level of superintelligence needed to understand patients’ emotional and social contexts and provide patient-centred care.

What are the limitations of current AI systems?

Current AI solutions are skilled at performing specific tasks only within a particular setting. Consider Google Health’s AI solution (Google, Mountain View, CA, USA), which uses a deep-learning model to identify signs of diabetic retinopathy in high-quality eye scans, diagnosing patients with accuracy comparable to that of human specialists in laboratory settings. However, when the system was deployed in real-world clinics in Thailand, where nurses conducted eye scans under variable lighting conditions, it rejected images that did not meet a certain quality threshold. This led to misdiagnoses, increased the workload of nurses who had to retake images, and wasted patients’ time on follow-up appointments.3
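To make the failure mode concrete, the following is a minimal, hypothetical Python sketch of such a hard quality gate in front of a diagnostic model; the threshold value, names, and logic are illustrative assumptions, not Google Health’s implementation.

# Hypothetical sketch: scans below an assumed quality threshold never reach
# the diagnostic model, pushing the work back onto clinic staff.

QUALITY_THRESHOLD = 0.8  # assumed cut-off; real systems tune this value

def triage_scan(image_quality: float, model_prediction: str) -> str:
    """Return the system's action for a single retinal scan."""
    if image_quality < QUALITY_THRESHOLD:
        # The nurse must retake the scan; the patient may need a follow-up visit.
        return "rejected: retake scan"
    return f"diagnosis: {model_prediction}"

print(triage_scan(0.95, "no diabetic retinopathy"))  # clears the gate in the lab
print(triage_scan(0.55, "no diabetic retinopathy"))  # rejected under dim clinic lighting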

Current AI solutions also face data challenges. Differences in patient demographics and biases in data limit the generalizability of predictive models in health care. For example, in a 2022 study, DeepMind’s (London, UK) model for predicting acute kidney injury performed well in male patients but poorly in female patients, because it was originally trained on data from the US Department of Veterans Affairs, where patients are predominantly male.4

Similarly, current AI solutions have a limited ability to learn from past experience in dynamic contexts such as health care. A recent study of machine-learning models designed to diagnose COVID-19 or predict its course from chest radiographs or computed tomography scans found that none of the 400 AI models examined could accurately diagnose COVID-19, because of flaws in model development or biases in the training data sets.5

Most important, AI implementation in health care is limited by a lack of organizational readiness to adopt innovative technology. Take, for instance, IBM Watson for Oncology (IBM Watson Health, Cambridge, MA, USA), an AI tool designed to aid oncologists with treatment recommendations for cancer patients. The University of Texas MD Anderson Cancer Center in Houston invested $62 million during the first five years of its partnership with Watson, but the project was not fully implemented because of difficulties in integrating Watson into the hospital setting.6

What is the role of AI in health care then?

Despite these limitations, AI is an expensive health care experiment that is advancing rapidly and is expected to become a crucial part of health professional teams. AI is already supporting physicians in three important roles.

AI as an assistant: AI performs routine tasks, such as capturing patient interactions, scheduling appointments, and synthesizing records, to reduce administrative burden. Here, technology has some intelligence and agency but still relies on the human user for the final decision and action. For example, Suki Assistant (Suki, Redwood City, CA, USA) is an AI-powered, voice-enabled digital tool that lets doctors dictate notes during patient encounters rather than typing or manually entering data into electronic health records. It has been shown to decrease documentation time per patient by 62% and to reduce after-hours charting time by 70%.7

AI as a partner: AI collaborates with the human user on a shared goal to improve the quality and efficiency of care delivery. At this level, technology has more intelligence and agency and can communicate, negotiate, and coordinate with the human user. For example, the da Vinci Surgical System (Intuitive Surgical, Sunnyvale, CA, USA) uses robotic arms and an AI algorithm to help stabilize a surgeon’s hand movements and filter out tremors, resulting in steadier and more controlled surgical actions.8

AI as a task leader: AI takes the lead in achieving a goal, such as remote monitoring of patients or virtual assistants that provide personalized support. At this level, technology has high intelligence and agency and can direct, motivate, and inspire the human user; it is responsible for the outcome and quality of the task. For example, Woebot (Woebot Health, San Francisco, CA, USA) is an AI-powered chatbot that uses natural language processing and learned responses to mimic conversation, remember past sessions, and provide mental health support and cognitive behavioural therapy to patients from the comfort of their smartphones.9

Will AI eventually replace physicians?

AI systems have demonstrated their ability to analyze extensive patient data proficiently, identifying trends and enabling early disease detection. They offer tailored options and interventions, enhancing the precision of medical care. In patient assessments, AI proves invaluable, particularly in time-sensitive settings, such as emergency rooms and walk-in clinics, where efficient triaging reduces wait times. In addition, AI-powered solutions extend beyond the traditional hospital setting, enabling continuous patient monitoring that promptly alerts physicians to potential concerns. This minimizes infrastructure costs and fosters a patient-centric, value-based model of care. AI’s prowess in data analysis and pattern recognition significantly enhances the accuracy of diagnostic processes, especially in medical imaging. Furthermore, the growing capability of AI to process natural language commands holds remarkable potential, ranging from providing preliminary medical guidance to managing patient reminders and streamlining administrative tasks.

Although AI’s impact on health care is undeniable, a complete replacement of doctors remains improbable because of the intricacies of medicine.10 Health care interactions between clinicians and patients are complex and often require cognitive skills to deconstruct patients’ narratives of their experience with illness and identify objective facts. Qualities like empathy and nuanced decision-making, crucial for intricate patient interactions, transcend AI’s capabilities. Nevertheless, when effectively designed and integrated, AI can empower physicians by handling tasks that don’t require human judgement. It will augment physicians’ capabilities and allow them to prioritize complex decision-making and patient-centred care. This synergy between AI and medical expertise heralds an exciting future for health care.

How can health care organizations create symbiotic relationships between physicians and AI?

Creating a symbiotic relationship between a physician and AI requires a strategic approach that combines organizational support, education, transparency, and a focus on patient well-being. Here’s how health care organizations can achieve this.

Bottom-up solutions: Encourage physicians to actively engage in the development of AI solutions that address real-world patient care challenges. Foster an environment where they can contribute their insights and experiences to create innovative AI-driven approaches.

Collaborative problem-solving: Promote collaboration between physicians and AI experts within the organization. This collaboration ensures that AI solutions align with the practical needs of health care providers, resulting in more relevant and valuable tools.

Transparency: Promote transparency in the use of AI systems. Ensure that physicians understand how AI-generated recommendations are reached and encourage AI systems to explain their decisions clearly. Transparent AI enhances trust and enables physicians to communicate treatment plans to patients confidently.

Patient-centric focus: Emphasize patient-centred care in AI implementation. Communicate to patients how AI supports their health care journey and its role in improving diagnostic accuracy, treatment options, and overall outcomes. This transparency reduces concerns about depersonalization and reinforces the human aspect of care.

Ethical guidelines: Establish and enforce clear ethical guidelines for AI use. Ensure that AI technologies are used responsibly and align with the organization’s values, strongly emphasizing patient safety, privacy, and well-being.

Accountability: Implement a robust AI governance framework that defines responsibility for AI-related decisions and errors. Prioritize patients’ welfare and families’ needs, and ensure that any issues arising from AI use are addressed promptly and transparently.

Upskilling and support: Create avenues for ongoing upskilling and support for physicians as AI technologies evolve. Ensure that physicians feel confident and equipped to collaborate with AI tools in complex clinical contexts.

By upholding these principles, organizations can successfully navigate the integration of AI while preserving the human-centric nature that defines the essence of medicine.

References

1. AI in Healthcare market analysis (2022–2027). Hyderabad, India: Market Data Forecast; 2023. Available: https://www.marketdataforecast.com/market-reports/artificial-intelligence-in-healthcare-market 

2. Muro M, Maxim R, Whiton J. Automation and artificial intelligence: how machines are affecting people and places. Washington, DC: Brookings Institution; 2019. 

3. Heaven WD. Google’s medical AI was super accurate in a lab. Real life was a different story. MIT Technol Rev 2020;27 Apr. Available: https://tinyurl.com/yvphs9dc  

4. Cao J, Zhang X, Shahinian V, Yin H, Steffick D, Saran R, et al. Generalizability of an acute kidney injury prediction model across health systems. Nat Mach Intell 2022;4:1121-9. https://doi.org/10.1038/s42256-022-00563-8 

5. Roberts M, Driggs D, Thorpe M, Gilbey J, Yeung M, Ursprung S, et al. Common pitfalls and recommendations for using machine learning to detect and prognosticate for COVID-19 using chest radiographs and CT scans. Nat Mach Intell 2021;3:199-217. https://doi.org/10.1038/s42256-021-00307-0 

6. Schmidt C. M.D. Anderson breaks with IBM Watson, raising questions about artificial intelligence in oncology. J Natl Cancer Inst 2017;109(5). https://doi.org/10.1093/jnci/djx113 

7. AAFP Innovation Labs. Using an AI assistant to reduce documentation burden in family medicine: evaluating the Suki Assistant. Leawood, Kansas: American Academy of Family Physicians; 2021. Available: https://www.aafp.org/dam/AAFP/documents/practice_management/innovation_lab/report-suki-assistant-documentation-burden.pdf 

8. Hamza H, Baez VM, Al-Ansari A, Becker AT, Navkar NV. User interfaces for actuated scope maneuvering in surgical systems: a scoping review. Surg Endosc 2023;37(6):4193-223. https://doi.org/10.1007/s00464-023-09981-0 

9. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health 2017;4(2):e7785. https://doi.org/10.2196/mental.7785 

10. Meskó B, Hetényi G, Győrffy Z. Will artificial intelligence solve the human resource crisis in healthcare? BMC Health Serv Res 2018;18(1):1-4.

Authors

Abi Sriharan, MSc, DPhil, is a senior scientist and the research director at the Krembil Centre for Health Management and Leadership, part of the Schulich School of Business at York University. She studies control systems and adaptive leadership behaviours in health sectors, specifically focusing on fostering innovation and understanding the dynamics of human-machine teams. @SriharanAbi

Savithiri Ratnapalan, MBBS, PhD, is director of the Health Systems Leadership and Innovation Program at the University of Toronto and professor of pediatrics and public health at the Temerty Faculty of Medicine, University of Toronto. 

Correspondence to: [email protected]

Disclaimer: Mention of specific products does not constitute their endorsement by the authors or publisher.

This article has been peer reviewed.