When The AI Is More Compassionate Than The Doctor

As artificial intelligence (AI) continues to make inroads into medical practice, many physicians expected that these new software tools would help them with highly technical tasks, such as detecting early breast cancers on mammogram images or predicting which patients are at highest risk of developing serious infections such as sepsis.

In an unexpected twist, physicians are also finding tools like ChatGPT helpful in the more “human” aspects of medical care, specifically in communicating with patients with greater empathy and compassion. A couple of months ago, I described a recent study in JAMA Internal Medicine that compared how AI chatbots and human physicians responded to patient medical questions posted on social media. To everyone’s surprise, the judges (who were blinded to the authorship of the answers) rated the chatbot answers as better than the human doctors’ answers in terms of both information quality and empathy!

Several physicians have taken this one step further and are using AI tools to help them communicate better with real-world patients. For example, ER physician Dr. Josh Tamayo-Sarver described an encounter with a patient’s family who kept insisting that he give IV fluids to their seriously ill mother, even though he knew that would be exactly the wrong treatment for her particular condition. After trying repeatedly to explain why she needed a different treatment plan, Dr. Tamayo-Sarver finally fired up ChatGPT-4 and asked it: “Explain why you would not give IV fluids to someone with severe pulmonary edema and respiratory distress even though you might be concerned that the patient is dehydrated. Explain it in simple and compassionate terms so that a confused person who cares about their mother can understand.”
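For readers curious about the mechanics, the same request can be scripted against the model directly rather than typed into the chat interface. Below is a minimal sketch using the OpenAI Python SDK; the client setup and model identifier are my assumptions for illustration, not details from Dr. Tamayo-Sarver’s account, though the prompt is his verbatim.

    # Minimal sketch: sending the same prompt programmatically.
    # Assumes the OpenAI Python SDK ("pip install openai") and an API key
    # stored in the OPENAI_API_KEY environment variable; the model name
    # below is an assumption for illustration.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    prompt = (
        "Explain why you would not give IV fluids to someone with severe "
        "pulmonary edema and respiratory distress even though you might be "
        "concerned that the patient is dehydrated. Explain it in simple and "
        "compassionate terms so that a confused person who cares about "
        "their mother can understand."
    )

    response = client.chat.completions.create(
        model="gpt-4",  # assumed model identifier
        messages=[{"role": "user", "content": prompt}],
    )

    print(response.choices[0].message.content)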

Within a few seconds, ChatGPT generated a thoughtful, empathetic response. (It’s worth reading the response in full.) As Dr. Tamayo-Sarver reports, “I printed this response up, and read it to the concerned family. As I recited ChatGPT’s words, their agitated expressions immediately melted into calm agreeability.” By using ChatGPT, he was able to comfort the patient’s family, while also freeing up valuable time to take care of the many other sick patients in the ER that night.

Similarly, New York Times reporter Gina Kolata described how Dr. Gregory Moore used ChatGPT to help counsel a friend with advanced cancer, including how to break the bad news about the lack of effective treatments and how to grapple with the fact that she wouldn’t be able to attend events, two years in the future, that were important to her and her loved ones. He was “stunned” at how compassionate the computer’s answers were. As Dr. Moore said, “I wish I would have had this when I was in training. I have never seen or had a coach like this.”

Likewise, the same New York Times piece explained why rheumatologist Dr. Richard Stern routinely uses ChatGPT in his clinical practice: “It writes kind responses to his patients’ emails, provides compassionate replies for his staff members to use when answering questions from patients who call the office and takes over onerous paperwork.”

There are many reasons that human physicians often don’t display sufficient empathy when talking with patients. One might be insufficient coaching in medical school and residency training. Another could be the delicate psychological balance physicians must maintain between seeing the patient as a whole person and holding in their minds the myriad technical details of the patient’s symptoms, test results, possible diagnoses, and optimal treatment choices. A third is undoubtedly the constant stress of the heavy (and ever-increasing) workloads almost all physicians are struggling with. ChatGPT and other AIs, on the other hand, never get hungry, never need to sleep, and never have to worry about maintaining a work-life balance.

I personally did not expect that AI tools like ChatGPT would be this effective at displaying empathy and compassion. (Or, more precisely, at generating text responses that, when read by patients, make them feel they are receiving empathy and compassion from another person.) To the extent that these tools help physicians take better care of their patients, I’m all in favor of them.

Dr. Tamayo-Sarver summarizes: “I’ve taken to using ChatGPT to help empathically explain specific medical scenarios to patients and their loved ones. It’s become an invaluable resource for the frequent situations where my ER ward is too busy or short-staffed for explaining complex medical diagnoses in a way that is accurate but easy to understand.”

However, the rise of “empathetic” AI raises interesting broader questions. For example:

  • What is the proper role of AI chatbots in emotionally delicate fields such as psychotherapy?
  • What will happen as more people seek out “AI girlfriends” or “boyfriends” in lieu of human romantic partners?
  • What will be the social repercussions when AIs can reliably “push emotional buttons” in voters for political purposes? (The New York Times recently reported that, “The Democratic Party experimented with fund-raising messages drafted by artificial intelligence in the spring — and found that they were often more effective at encouraging engagement and donations than copy written entirely by humans.”)

We don’t yet know the answers to these questions. But it will be interesting to see how “empathetic” AI technology unfolds over the next few years!
