A Perth woman has taken to social media for advice after seeing her doctor using ChatGPT in an appointment.
The recent surge in the use of AI across most industries has left people wondering: is this the new norm?
“This feels like crossing a line”
A Perth woman has posted her story on Reddit after seeing her GP use ChatGPT to get advice on her next steps.
The woman said that she went to the appointment with the intention of getting test results. However, she got the shock of her life when she saw her doctor copy her information into ChatGPT.
“Never felt like I was sat in front of a stupid doctor til now,” she wrote in the lengthy post.
“Feels like peak laziness and stupidity and inaccurate medical advice.”
The woman clarified that her doctor copied her blood test results into ChatGPT, along with her age, resulting in a list of suggestions for what she should do next.
Knowing how common it is for doctors to Google things or consult medical journals during an appointment, the woman was left wondering: is this normal?
“I’ve had doctors Google things or go on Mayo Clinic to corroborate their own ideas but this feels like crossing a line professionally and ethically and I probably won’t go back,” she wrote.
“Thoughts?? Are other people experiencing this when they go to the GP?”
The woman said that she spoke to AHPRA — the Australian Health Practitioner Regulation Agency — and will also be taking the issue to the Health and Disability Services Complaints Office.
Does AI belong in a doctor’s office?
Commenters on the post agreed with the poster wholeheartedly, expressing concerns about the legitimacy of an AI diagnosis.
But, according to Aussie GP Sam Hay, sometimes, it might be warranted.
“Does AI have a place in medicine? I think it does… in the right hands!” he told Kidspot.
“Any web-based search tool has its uses for doctors as a means of quickly finding information and facts. I use them daily, including in front of patients. But, and it’s a big but, you have to know how to trust the information that’s thrown up.”
Sam added that it’s for this reason that it’s so important to find a doctor that you trust.
“For me, it’s all about using reliable resources from reputable sources – which means I have to filter through those that are clearly gimmicks and advertisements. Is every doctor doing that? No, I don’t think so – and that’s why you have to do your research and develop relationships with a doctor you can communicate and relate to,” he said.
“When it comes to ChatGPT-like engines, once again, they can be used to guide a doctor and offer suggestions — basically to help us ‘not miss anything’ rather than ‘provide the diagnosis without thinking’.
“Your doctor should be ensuring any engine is including recent evidence, reputable sources, and industry standard guidelines. And nothing replaces clinical assessment — the skill is knowing how to interpret what ChatGPT throws up for the patient in front of me!”