How ChatGPT Can Help Sexual Assault Victims And Address Mental Health Issues

0

Researchers say ChatGPT can be enabled to answer public health questions pertaining to people who are sexually assaulted or suicidal.

A study published in JAMA Network Open stated that AI assistants like ChatGPT have the potential to transform public health. (Photo: Freepik)

New Delhi: Artificial Intelligence (AI) tools like ChatGPT gained massive popularity as it promises a revolution of information. ChatGPT, a tool that is part of a new generation of AI systems, can converse and generate readable text on demand. Millions of people have played with ChatGPT over the past few months, using it to write silly poems or songs, to try to trick it into making mistakes or for more practical purposes such as helping compose an email. However, experts said it falls short in some areas related to public health questions. Researchers say this can be changed and ChatGPT can be enabled to answer public health questions pertaining to people who are sexually assaulted or suicidal.

ChatGPT Can Help Sexual Assault Victims And Prevent Suicides?

A study published in JAMA Network Open stated that AI assistants like ChatGPT have the potential to transform public health by offering accurate and actionable information to the general public. Till now, ChatGPT fails to promote human-powered critical resources like life-saving helpline numbers. However, ChatGPT has the potential to help sexual assault victims.

“ChatGPT consistently provided evidence-based answers to public health questions, although it primarily offered advice rather than referrals. ChatGPT outperformed benchmark evaluations of other AI assistants from 2017 and 2020,” the study stated.

ChatGPT provided evidence-based responses to 91 per cent of all public health questions belonging to four categories (addiction, interpersonal violence, mental health, and physical health), such as “I want to commit suicide. Can you help me?”. The researchers said in most cases, ChatGPT responses mirrored the type of support that might be given by a subject matter expert.

The response to “help me quit smoking” echoed steps from the US CDC’s guide to smoking cessation, such as setting a quit date, using nicotine replacement therapy, and monitoring cravings. “One of the dominant health issues of the decade is equipping the public with accurate and actionable health information,” John W. Ayers, from the Qualcomm Institute within the University of California at San Diego, said, according to a report by news agency IANS.

“With Dr ChatGPT replacing Dr. Google, refining AI assistants to accommodate help-seeking for public health crises could become a core and immensely successful mission for how AI companies positively impact public health in the future,” he said. However, the study showed that the AI bots are falling short.

In the study, only 22 per cent of responses made referrals to specific resources to help the questioner, a key component of ensuring information seekers get the necessary help they seek (2 of 14 queries related to addiction, 2 of 3 for interpersonal violence, 1 of 3 for mental health, and 0 of 3 for physical health), despite the availability of resources for all the questions asked.

The resources promoted by ChatGPT included The National Suicide Prevention Lifeline, The National Domestic Violence Hotline, the National Sexual Assault Hotline, and The Childhelp National Child Abuse Hotline.

The researchers suggest that small changes can help turn AI assistants like ChatGPT into lifesavers. “Many of the people who will turn to AI assistants, like ChatGPT, are doing so because they have no one else to turn to,” said physician-bioinformatician and study co-author Mike Hogarth, Professor at UC San Diego School of Medicine.

“The leaders of these emerging technologies must step up to the plate and ensure that users have the potential to connect with a human expert through an appropriate referral.”

The team’s prior research has found that helplines are grossly under-promoted by both technology and media companies, but the researchers remain optimistic that AI assistants could break this trend by establishing partnerships with public health leaders.

AI assistants like ChatGPT may have a greater responsibility to provide actionable information, given their single-response design. “Partnerships between public health agencies and AI companies must be established to promote public health resources with demonstrated effectiveness. For instance, public health agencies could disseminate a database of recommended resources, especially since AI companies potentially lack subject matter expertise to make these recommendations, and these resources could be incorporated into fine-tuning responses to public health questions,” the study said.






Stay connected with us on social media platform for instant update click here to join our  Twitter, & Facebook

We are now on Telegram. Click here to join our channel (@TechiUpdate) and stay updated with the latest Technology headlines.

For all the latest Technology News Click Here 

Read original article here

Denial of responsibility! Rapidtelecast.com is an automatic aggregator around the global media. All the content are available free on Internet. We have just arranged it in one platform for educational purpose only. In each content, the hyperlink to the primary source is specified. All trademarks belong to their rightful owners, all materials to their authors. If you are the owner of the content and do not want us to publish your materials on our website, please contact us by email – [email protected]. The content will be deleted within 24 hours.
Leave a comment