Sylvester study finds limits in ChatGPT’s accuracy for blood cancer advice

Justin Taylor, M.D., senior author of the study and physician at Sylvester

Artificial intelligence is increasingly being used in healthcare, with patients turning to AI chatbots like ChatGPT for information and advice. A recent study by the Sylvester Comprehensive Cancer Center examined how well ChatGPT 3.5, the version of OpenAI's chatbot that was widely available as of July 2024, responds to questions about blood cancer.

The research involved four hematology-oncology physicians who evaluated ChatGPT’s answers to ten questions that reflected typical patient concerns. The questions ranged from general topics such as side effects of chemotherapy to more specific queries about newer treatments like BCL-2 inhibitors.

Results showed that ChatGPT 3.5 performed better on general cancer-related questions than on those involving new therapies or specific treatment approaches. The chatbot received an average score of 3.38 out of 5 for general questions but only 3.06 for questions about newer therapies. None of its responses received a perfect score from any evaluator.

Justin Taylor, M.D., senior author of the study and physician at Sylvester, cautioned patients: “I would warn patients to have some skepticism, especially about answers dealing with specific types of cancer and treatments, and check with their doctor.”

A key limitation identified was that ChatGPT 3.5’s knowledge base only included data up until 2021, meaning it could not provide information on medical advances made after that year. Dr. Taylor explained the importance of human expertise: “When new drugs or research findings emerge, oncologists check in with their colleagues, discuss the implications and think about how to adapt them to their patients.” He added, “Chatbots can’t provide that kind of nuance and personalized understanding.”

Despite these limitations, Dr. Taylor noted that AI tools can still be useful for organizing thoughts before appointments or identifying primary sources for further discussion with healthcare professionals. However, he emphasized: “Physician oversight remains essential for vetting AI-generated medical information before patient use.”

Patients are advised to consult their doctors regarding any information obtained from AI or online sources in order to receive guidance tailored to their individual health needs.


