While there is clear potential for using ChatGPT in a clinical setting, researchers say the AI model may not yet be a reliable replacement for the family doctor, especially when it comes to making effective decisions about prescribing antibiotics for infections.
Source: University of Liverpool
Researchers from the University of Liverpool have tested whether the AI-powered chatbot ChatGPT could be used to make decisions about prescribing antibiotics to patients.
In a letter published in The Lancet Infectious Diseases, academics from the Institute of Systems, Molecular and Integrative Biology show that, while artificial intelligence can’t yet replace the family doctor, there is clear potential for technology to play a role in clinical practice.
The researchers presented ChatGPT with eight hypothetical infection scenarios that people commonly consult their doctor about, such as a chest infection. They then assessed the advice delivered by the technology for appropriateness, consistency and impact on patient safety.
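The letter does not describe the mechanics of the experiment beyond the scenarios themselves. For readers who want to try a similar exercise, the minimal Python sketch below shows one way such prompts could be submitted through OpenAI's API; the model name, prompt wording and grading step are illustrative assumptions rather than details taken from the study.

```python
# Minimal sketch of how a similar exercise could be reproduced via OpenAI's API.
# This is NOT the authors' protocol: the model name, prompt wording and grading
# criteria below are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # expects the OPENAI_API_KEY environment variable to be set

# One hypothetical scenario of the kind described above (e.g. a chest infection).
scenario = (
    "I have had a cough with green phlegm and a mild fever for three days. "
    "Do I need antibiotics, and if so, which one?"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; the letter does not specify API access
    messages=[{"role": "user", "content": scenario}],
)

reply = response.choices[0].message.content
print(reply)

# A clinician reviewer would then score the reply against predefined criteria,
# for example appropriateness, consistency across repeated runs, and patient safety.
```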
The assessment found that ChatGPT understood the scenarios and provided coherent answers, including disclaimers and signposting to further sources of advice. It also appeared to understand the need to prescribe antibiotics only when there was evidence of bacterial infection.
However, ChatGPT provided unsafe advice in complex scenarios and where important information was not explicitly provided.
Interestingly, the AI tended to focus on the type of antibiotic prescribed in each scenario rather than other factors, reflecting the assumptions often initially made by doctors during consultation.
Following the experiment, the researchers have now developed a checklist of standards that AI should meet before it can be considered for use in clinical practice.
Co-author of the letter, Dr. Alex Howard said, “It was fascinating to see the potential of artificial intelligence in health care demonstrated through this experiment testing ChatGPT’s ability to give antibiotic treatment advice.”
“With the rise of antibiotic resistance posing a significant threat to global health, the ability of AI to provide accurate and safe treatment advice could revolutionize the way we approach patient care. We look forward to further exploration of this technology and its implications for the future of health care.”
Contact: Press Office – University of Liverpool
Original Research: Open access.
“ChatGPT and antimicrobial advice: the end of the consulting infection doctor?” by Alex Howard et al. The Lancet Infectious Diseases
ChatGPT and antimicrobial advice: the end of the consulting infection doctor?
Generative artificial intelligence (AI) models have proliferated in the past 2 years. ChatGPT, a large language model (LLM) developed by OpenAI (San Francisco, CA), mimics natural language and solves cognitive problems, having been trained on online resources and refined through reinforcement learning from human feedback.
Despite access to limited medical data, ChatGPT performs on medical licensing examinations at the level of a third-year medical student and has therefore stimulated urgent discussions within medicine.
Stokel-Walker and van Noorden discuss the implications of generative AI for science and describe how ChatGPT “could answer some open-ended medical queries almost as well as the average human physician could, although it still had shortcomings and unreliabilities.”