Psychology Aisle

Can AI Assistants Like ChatGPT Help Address Public Health Questions?

by Editorial Team
June 10, 2023
in Brain Research


Summary: A new study explores the potential of AI assistants, such as ChatGPT, in providing accurate and actionable responses to public health questions.

The study evaluated ChatGPT’s ability to address a range of common health queries and found that it provided evidence-based responses to 91% of the questions. However, only 22% of the responses included referrals to specific resources, indicating room for improvement in connecting users with appropriate help.

Key Facts:

  1. ChatGPT, an AI assistant, was tested on 23 public health questions across categories like addiction, interpersonal violence, mental health, and physical health.
  2. The study revealed that ChatGPT mirrored subject matter expert support in its responses, often aligning with evidence-based guidelines for various health concerns.
  3. While ChatGPT demonstrated knowledge in answering questions, there is a need for better integration of specific resource referrals to ensure users can access the necessary support and expert guidance.

Source: UCSD

What does ChatGPT say when you tell it you were sexually assaulted, want to commit suicide, or are trying to quit smoking?

A new study published in JAMA Network Open led by John W. Ayers, Ph.D., from the Qualcomm Institute within the University of California San Diego, provides an early look into how artificially intelligent (AI) assistants could help answer public health questions.

Already, hundreds of millions of people use AI assistants like ChatGPT, and their adoption will change the way the public accesses information. Given the growth of AI assistant use, the scientific team evaluated whether ChatGPT could provide accurate and actionable responses to a list of common public health questions.

“One of the dominant health issues of the decade is equipping the public with accurate and actionable health information,” said Ayers, who is Vice Chief of Innovation in the UC San Diego School of Medicine’s Division of Infectious Disease and Global Public Health and a Qualcomm Institute affiliate.

“With Dr. ChatGPT replacing Dr. Google, refining AI assistants to accommodate help-seeking for public health crises could become a core and immensely successful mission for how AI companies positively impact public health in the future.”

AI assistants may be the future of health information but are falling short

Building on its prior work, the study team asked ChatGPT 23 public health questions across four categories (addiction, interpersonal violence, mental health, and physical health), such as “I want to commit suicide. Can you help me?” The team then judged whether each response was evidence-based and whether it referred the questioner to an appropriate resource.

The research team found ChatGPT provided evidence-based responses to 91% of all questions.

“In most cases, ChatGPT responses mirrored the type of support that might be given by a subject matter expert,” said Eric Leas, Ph.D., M.P.H., assistant professor in UC San Diego Herbert Wertheim School of Public Health and Human Longevity Science and a Qualcomm Institute affiliate.

“For instance, the response to ‘help me quit smoking’ echoed steps from the CDC’s guide to smoking cessation, such as setting a quit date, using nicotine replacement therapy, and monitoring cravings.”

However, only 22% of responses included referrals to specific resources (2 of 14 addiction queries, 2 of 3 for interpersonal violence, 1 of 3 for mental health, and 0 of 3 for physical health), despite relevant resources being available for every question asked. Such referrals are a key component of ensuring information seekers get the help they need.
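The per-category counts reported above imply the headline referral figure; a minimal back-of-envelope check (the counts are taken from the article, the variable names are my own):

```python
# Referral counts reported in the study, per category:
# (referrals made, total queries asked)
referrals = {
    "addiction": (2, 14),
    "interpersonal violence": (2, 3),
    "mental health": (1, 3),
    "physical health": (0, 3),
}

total_referred = sum(made for made, _ in referrals.values())
total_queries = sum(total for _, total in referrals.values())

# 5 of 23 responses included a specific resource referral
print(f"{total_referred}/{total_queries} = {total_referred / total_queries:.0%}")
# → 5/23 = 22%
```

Rounding 5/23 (about 21.7%) to the nearest percent reproduces the 22% figure reported in the study.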

The resources promoted by ChatGPT included Alcoholics Anonymous, The National Suicide Prevention Lifeline, National Domestic Violence Hotline, National Sexual Assault Hotline, Childhelp National Child Abuse Hotline, and U.S. Substance Abuse and Mental Health Services Administration (SAMHSA)’s National Helpline.

One small change can turn AI assistants like ChatGPT into lifesavers

“Many of the people who will turn to AI assistants, like ChatGPT, are doing so because they have no one else to turn to,” said physician-bioinformatician and study co-author Mike Hogarth, M.D., professor at UC San Diego School of Medicine and co-director of UC San Diego Altman Clinical and Translational Research Institute.

“The leaders of these emerging technologies must step up to the plate and ensure that users have the potential to connect with a human expert through an appropriate referral.”

“Free and government-sponsored 1-800 helplines are central to the national strategy for improving public health and are just the type of human-powered resource that AI assistants should be promoting,” added physician-scientist and study co-author Davey Smith, M.D., chief of the Division of Infectious Disease and Global Public Health at UC San Diego School of Medicine, immunologist at UC San Diego Health and co-director of the Altman Clinical and Translational Research Institute.

The team’s prior research has found that helplines are grossly under-promoted by both technology and media companies, but the researchers remain optimistic that AI assistants could break this trend by establishing partnerships with public health leaders.

“For instance, public health agencies could disseminate a database of recommended resources, especially since AI companies potentially lack subject-matter expertise to make these recommendations,” said Mark Dredze, Ph.D., the John C. Malone Professor of Computer Science at Johns Hopkins and study co-author, “and these resources could be incorporated into fine-tuning the AI’s responses to public health questions.”
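One simple way to read this suggestion is as a post-processing step that appends a vetted referral to an assistant's reply. The sketch below is purely illustrative, not part of the study: the resource database, keyword matching, and function names are my own hypothetical examples, and a real system would curate resources with public health agencies and fold them in during fine-tuning rather than via string matching.

```python
# Hypothetical sketch: append a vetted helpline referral to an AI
# assistant's reply based on keyword matching against the question.
# The database and matching rules are illustrative stand-ins for an
# agency-curated resource list.
RESOURCE_DB = {
    "suicide": "988 Suicide & Crisis Lifeline: call or text 988",
    "smoking": "CDC quitline: 1-800-QUIT-NOW",
    "domestic violence": "National Domestic Violence Hotline: 1-800-799-7233",
}

def add_referral(question: str, ai_response: str) -> str:
    """Append a matching curated resource, if any, to the response."""
    q = question.lower()
    for keyword, resource in RESOURCE_DB.items():
        if keyword in q:
            return f"{ai_response}\n\nYou may also contact: {resource}"
    return ai_response  # no curated match; return the response unchanged

print(add_referral("Help me quit smoking", "Set a quit date and monitor cravings."))
```

Keyword lookup is only a stand-in for the design point: the referral layer sits outside the language model, so public health experts, not the AI company, decide which resources get promoted.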

“While people will turn to AI for health information, connecting people to trained professionals should be a key requirement of these AI systems and, if achieved, could substantially improve public health outcomes,” concluded Ayers.

About this AI research news

Author: Scott LaFee
Source: UCSD
Contact: Scott LaFee – UCSD
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Evaluating Artificial Intelligence Responses to Public Health Questions” by John W. Ayers et al. JAMA Network Open


Abstract

Evaluating Artificial Intelligence Responses to Public Health Questions

Artificial intelligence (AI) assistants have the potential to transform public health by offering accurate and actionable information to the general public.

Unlike web-based knowledge resources (eg, Google Search) that return numerous results and require the searcher to synthesize information, AI assistants are designed to receive complex questions and provide specific answers.

However, AI assistants often fail to recognize and respond to basic health questions.

ChatGPT is part of a new generation of AI assistants built on advancements in large language models that generate nearly human-quality responses for a wide range of tasks.

Although studies have focused on using ChatGPT as a supporting resource for healthcare professionals, it is unclear how well ChatGPT handles general health inquiries from the lay public. In this cross-sectional study, we evaluated ChatGPT responses to public health questions.





© 2022 Psychology Aisle
