
ChatGPT: Friend or foe?

Apr 25th, 2023

By Ethan Popowitz | 7 min read

ChatGPT is here and I, for one, welcome our new AI overlord.

In case you’ve been cut off from the Internet for the last six months (if so, welcome back!), ChatGPT is a natural language processing tool developed by OpenAI and trained on a massive set of data from across the Internet.

ChatGPT’s ability to engage in human-like conversations has captured the attention of millions of people, making it the fastest-growing app of all time. According to an analysis by the Swiss bank UBS, the tool had 100 million active users in January 2023, only two months after its launch. For comparison, it took nine months for TikTok to reach that many users.

Even in the short time I’ve dabbled with it, I can understand the hype. I used ChatGPT to suggest ideas for a party, describe scenes for a short story I’m writing, explain how wormholes work, and even help write the blog you’re reading right now. And people more creative than me are using it to make music, code full websites, create recipes, cheat on their homework, and so much, much more.

With the capabilities of ChatGPT seemingly limited only by human imagination, it’s only natural to wonder how it can be used to revolutionize healthcare. Read on to see—partially in its own words—how the tool is transforming the healthcare industry.

Diagnosis and treatment

Doctors, rejoice! The era of patients saying “That’s not what I read on WebMD” may be coming to an end.

But don’t celebrate too quickly—you might just be trading one adage for another. With the right combination of prompts, ChatGPT can be used to suggest possible diagnoses. As Dr. Prithvi Santana explains in his TikTok, the chatbot can analyze a patient’s symptoms and medical history to generate a list of potential diagnoses. It can also recommend appropriate investigations to order and treatment options based on the diagnosis.

I tried it for myself, just to see it with my own eyes. Using a sample patient history from the University of North Carolina at Chapel Hill, I fed ChatGPT the patient’s symptoms (episodes of chest and neck pain, accompanied by shortness of breath) and their medical background (hypertension, total hysterectomy, family history of premature coronary artery disease).

Using the information I gave, the chatbot suggested the patient could have coronary artery disease or myocardial infarction. It then suggested several investigations, such as an ECG and blood tests, to determine the cause of the patient’s chest pain. It even suggested possible treatment options, such as lifestyle modifications, prescription medication, and a referral to a cardiologist in case of a serious heart condition.
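
If you’d rather script that experiment than type it into the chat window, here’s a rough sketch using OpenAI’s Python package (the pre-1.0 openai library that was current when this post was written). The prompt wording and model choice are illustrative assumptions rather than the exact prompt I used, and only de-identified details should ever be sent.

```python
# Rough sketch: sending a de-identified case summary to the OpenAI API.
# Assumes the pre-1.0 `openai` Python package; the prompt wording, model
# choice, and API key placeholder are illustrative, not the exact setup
# used in this post. Never send real patient identifiers or PHI.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

case_summary = (
    "Episodes of chest and neck pain accompanied by shortness of breath. "
    "History of hypertension, a total hysterectomy, and a family history "
    "of premature coronary artery disease."
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "system",
            "content": (
                "You are assisting a physician. For the following case, list "
                "potential diagnoses, suggested investigations, and possible "
                "treatment options."
            ),
        },
        {"role": "user", "content": case_summary},
    ],
)

print(response["choices"][0]["message"]["content"])
```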

So, what does this mean? Will ChatGPT come for the jobs of our doctors and nurses?

Probably not, but they may start hearing “That’s what I read on ChatGPT” a lot more often as patients use it to learn about and diagnose their own conditions. According to Dr. Santana, the tool will accelerate doctors’ work: its capacity to synthesize and interpret medical information can save healthcare providers valuable time and free them to care for more patients.

Medical education

While ChatGPT is raising concerns about plagiarism and cheating, it can still be used in several ways to improve the quality of education for medical students. Here are a few examples:

  1. Generating case scenarios. Like the example I walked through in the above section, the chatbot can generate case studies and scenarios to help medical students improve their diagnostic and treatment-planning abilities. This not only has the potential to help students develop their clinical reasoning skills but can also prepare them for real-world clinical scenarios.
  2. Teaching assistance. Instructors can use ChatGPT to generate exercises, quizzes, flashcards, and other materials to help students learn. With a few added prompts, it can even assess student papers and automate grading.
  3. Language translation. Not only can students and teachers use it to translate class materials into different languages, but it can also help future healthcare providers explain complex medical conditions and processes to patients in simpler terms.

ChatGPT can also assist both students and professionals with their medical research. It can quickly digest and analyze medical texts, clinical trials, research papers, and other sources of data. This can be used to generate summaries and highlight key findings. More interestingly, it can be used to identify patterns and trends, which could help researchers create new hypotheses and advance medical breakthroughs.

Enhancing the doctor’s office experience

Telehealth technology has already made it easier than ever for patients to engage with their doctors and get the care they need. Beyond the ability to interact with a physician through video or over the phone, many telehealth platforms offer patient portals, scheduling services, and more.

ChatGPT can take this experience to the next level. Currently, developers can integrate the technology into their apps and products through an API.

This can take the shape of a sophisticated virtual assistant or chatbot that serves as a ‘first point of contact’ with patients. A ChatGPT-enhanced chatbot can help patients get quick answers to their questions, schedule appointments seamlessly, and receive medication reminders. In addition, ChatGPT can near-instantly generate full transcripts of meetings for both the patient’s and the provider’s records and identify action items for the next meeting, keeping everyone aligned on what needs to be done.
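
To make that a little more concrete, here’s a minimal sketch of what such a ‘first point of contact’ chatbot could look like, again using the pre-1.0 openai Python package. The system prompt, model choice, and console loop are illustrative assumptions; a real integration would hook into scheduling and reminder systems and apply the security safeguards discussed later in this post.

```python
# Rough sketch: a 'first point of contact' chatbot built on the OpenAI API.
# Assumes the pre-1.0 `openai` Python package; the system prompt, model,
# and console loop are illustrative. A production integration would add
# authentication, scheduling hooks, and strict controls to keep PHI out
# of prompts.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

SYSTEM_PROMPT = (
    "You are a front-desk assistant for a medical practice. Answer general "
    "questions, help patients request appointments, and remind them that "
    "you cannot provide medical advice or handle emergencies."
)


def chat():
    # Keep the running conversation so the model has context for follow-ups.
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        user_input = input("Patient: ")
        if user_input.lower() in {"quit", "exit"}:
            break
        messages.append({"role": "user", "content": user_input})
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=messages,
        )
        reply = response["choices"][0]["message"]["content"]
        messages.append({"role": "assistant", "content": reply})
        print("Assistant:", reply)


if __name__ == "__main__":
    chat()
```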

Assistance with medical writing

In an article published by the Radiological Society of North America, Dr. Som Biswas stated that ChatGPT can greatly improve the speed and accuracy of document creation.

By giving it the proper prompts, Dr. Biswas found the tool incredibly helpful in accelerating the process of writing radiology reports. It took only a matter of seconds for ChatGPT to generate a “good enough” radiology report that could serve as a draft for a human to work from. Dr. Biswas notes that a human is still needed, as the report featured some incorrect words, but the time saved meant that he and his team could focus on higher-priority tasks.

Beyond report writing, healthcare organizations can use ChatGPT to draft preauthorization letters to insurance providers and care instructions for patients following a procedure. With so many healthcare workers citing administrative tasks as a primary cause of burnout, using this technology to automate the process (or at least reduce the hours spent on such tasks each week) can help healthcare workers stay mentally engaged and focused on delivering care to their patients.

Risks and concerns with ChatGPT

While ChatGPT does have the potential to be useful, it can also be a double-edged sword. Here are just a few of the risks and concerns that patients, students, providers, and professionals should keep in mind before using it.

You can’t blindly trust it

With ChatGPT’s ability to answer seemingly any question imaginable, users may blindly trust the system to be 100% truthful all the time.

It’s not. The chatbot’s answers are delivered in a manner that sounds confident and comprehensible, even when the answer is incorrect. This can be particularly dangerous in healthcare, as it can erode the carefully built relationship of trust between doctor and patient. And for now, even OpenAI admits that its creation’s tendency to write plausible-sounding but incorrect or nonsensical answers is a challenging issue to fix.

It can be vulnerable to hacking and cybercrime

The healthcare industry is a very common target for cybercrime. In fact, more than 52 million people had their protected health information (PHI) exposed in more than 700 data breaches throughout 2022.

Knowing this, healthcare organizations should implement robust security measures to protect patient data before using ChatGPT. This includes encryption, access controls, user authentication, and other safeguards against a data breach.

Unfortunately, patients may be putting their PHI at risk without even fully realizing it. When engaging in a dialogue with the chatbot, they may inadvertently hand over their PHI in exchange for answers about their symptoms and conditions. As of this writing, ChatGPT stores the information you give it and uses that data to train itself, meaning any PHI you hand over could potentially be accessible to millions of people.

It’s not 100% accurate

Accuracy is another major concern. Use the chatbot long enough and you may notice it tends to cite sources that never existed, use fake statistics, and generate incorrect information presented as fact.

Part of the problem is that ChatGPT is only as accurate and reliable as the data used to train it. The current model was trained on data from before late 2021, meaning it could offer incorrect or outdated medical information.

While ChatGPT does have the ability to admit its own mistakes, it’s entirely dependent on the user to challenge it. That requires the user to recognize that the answer they’ve been given is incorrect or misinterpreted, which isn’t always possible when they’re asking about something they don’t already understand.

It won’t replace humans just yet

While ChatGPT has many applications, it suffers from problems that can only be solved by a human.

  • Bias: Like humans, ChatGPT learns from large datasets that may include biased or incomplete information, which can lead to inaccurate or unfair recommendations for patients, especially those from underrepresented groups. Human healthcare professionals, however, undergo training and have access to resources that combat inherent biases.
  • Lack of empathy: Despite the chatbot’s responses appearing warm and welcoming, it cannot actually empathize with users or understand their emotions, making it ill-equipped to handle sensitive conversations.
  • Lack of experience: ChatGPT can’t replace the intrinsic value of the patient/physician relationship, and it lacks the experience and judgment providers draw on when weighing a patient’s individual needs in a treatment plan.

How will the future of healthcare unfold?

Is ChatGPT the next big technological advancement in healthcare? Does it signal the rise of Skynet?

As with most new technologies, the reality will likely fall somewhere in between. That being said, the challenges and risks associated with the technology are real and should be carefully considered.

Many healthcare providers believe that ChatGPT can revolutionize the healthcare industry—if used effectively. But it might be helpful to rewatch Terminator.

In case the robot uprising does come sooner than expected, arm yourself with healthcare commercial intelligence and get a comprehensive view of the healthcare landscape. Or start a free trial with Definitive Healthcare today.


About the Author

Ethan Popowitz

Ethan Popowitz is a Senior Content Writer at Definitive Healthcare. He writes data-driven articles about telehealth, AI, the healthcare staffing shortage, and everything in…
