Technology · Society · Health

ChatGPT Becomes the New WebMD for Legal and Medical Advice

January 7, 2026 · 12 min read · 2,238 words

Key Facts

  • 57% of consumers have used or would use AI to answer legal questions, according to a December 2025 Clio survey
  • One in three Americans use generative AI tools for health advice weekly, and one in ten use them daily, per a 2025 Zocdoc survey
  • 17% of US adults consult AI chatbots at least once a month for health information, but 56% of those users lack confidence in the accuracy, per a 2024 KFF poll
  • A third of Americans would trust ChatGPT more than a human expert, according to a 2025 SurveyMonkey and Express Legal Funding poll

In This Article

  1. Quick Summary
  2. Legal Professionals Face AI-Driven Clients
  3. Healthcare AI Usage and Concerns
  4. Privacy and Privilege Risks ⚠️
  5. The Future of Professional Services

Quick Summary

Generative AI chatbots are rapidly becoming the first stop for Americans seeking answers to complex legal and medical questions, functioning as a digital hybrid of WebMD and LegalZoom. This shift is fundamentally changing how people approach professional services and what they expect from doctors and lawyers.

Recent surveys reveal the scale of this transformation. In December 2025, legal software company Clio reported that 57% of consumers have used or would use AI to address legal questions. Meanwhile, health tech company Zocdoc found that one-third of Americans consult AI tools for health advice weekly, with one in ten doing so daily.

Legal and medical professionals are adapting to a new reality where clients and patients arrive armed with AI-generated research. While this democratizes access to information, it also creates friction as professionals work to correct misconceptions and rebuild trust. Privacy concerns loom large, as sharing sensitive case details or medical histories with consumer AI platforms may void legal protections.

Legal Professionals Face AI-Driven Clients

Jonathan Freidin, a medical malpractice attorney in Miami, identifies ChatGPT users by distinct patterns in their communications. A few times a week, he says, people fill out his firm's client contact sheet with text littered with emojis and headings, a telltale sign that they copied and pasted from ChatGPT.

Many clients claim they have "done a lot of research" using AI tools. Freidin notes, "We're seeing a lot more callers who feel like they have a case because ChatGPT or Gemini told them that the doctors or nurses fell below the standard of care in multiple different ways. While that may be true, it doesn't necessarily translate into a viable case."

Jamie Berger, a family law attorney in New Jersey, describes how the dynamic has shifted. Previously, clients knew little about divorce proceedings and sought information from attorneys. Now, they arrive with generic step-by-step plans that often don't fit their specific circumstances. Berger explains, "We have to dispel the information that they were able to obtain versus what is actually going on in their case and kind of work backwards."

She adds that AI usage changes communication patterns: if a client's tone suddenly shifts after an email exchange, Berger suspects they are using AI to write out lengthy legal strategies or questions. "You have to rebuild or build the attorney-client relationship in a way that didn't used to exist," she says. "They don't realize that there's so many offshoots along the way that it's not a linear line from A to Z."

"We're seeing a lot more callers who feel like they have a case because ChatGPT or Gemini told them that the doctors or nurses fell below the standard of care in multiple different ways. While that may be true, it doesn't necessarily translate into a viable case."

— Jonathan Freidin, Medical Malpractice Attorney

Healthcare AI Usage and Concerns

Medical professionals report similar patterns. Oliver Kharraz, CEO of Zocdoc, predicts that "AI will become the go-to tool for pre-care needs like symptom checking, triage, and navigation, as well as for routine tasks like refills and screenings." However, he cautions that "patients will recognize that it is no substitute for the vast majority of healthcare interactions, especially those that require human judgment, empathy, or complex decision-making."

AI chatbots offer advantages that human providers struggle to match. They provide immediate responses without wait times and operate 24/7. Hannah Allen, chief medical officer at AI medical scribe tool Heidi, explains the appeal: "They really love that tempo of being able to know that ChatGPT never goes away, never goes to sleep, never says no, never says, 'sorry, your list is too long.'"

Some clinicians view AI as a useful second opinion tool. Heidi Schrumpf, director of clinical services at teletherapy platform Marvin Behavioral Health, says patients sometimes verify her advice with ChatGPT. "It's great that they have the access to a quick second opinion," she notes, "and then, if it doesn't agree with me, that allows them to ask me better questions."

However, confidence in AI accuracy remains low. A 2024 KFF poll found that while 17% of US adults consult AI chatbots monthly for health information, 56% of those users lack confidence in the accuracy of the information provided.

Privacy and Privilege Risks ⚠️

Sharing sensitive information with consumer AI platforms carries significant legal risks. Beth McCormack, dean of Vermont Law School, warns about attorney-client privilege: "There's also a risk of voiding the kind of protections people get from the attorney-client confidentiality privilege if people put too much specific information about their case into a chatbot."

Medical privacy faces similar vulnerabilities. HIPAA, the federal law protecting confidential health information, does not apply to consumer AI products. Patients sharing entire medical histories with ChatGPT lack the legal protections they would have with healthcare providers.

McCormack emphasizes the complexity of legal matters: "There's so much nuance to the law. It's so fact dependent." This nuance is often lost in AI-generated responses, which may provide authoritative-sounding but generic advice that doesn't account for specific jurisdictions or circumstances.

OpenAI has acknowledged these concerns. The company updated its policies last fall, specifying that users cannot turn to ChatGPT for "provision of tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional." However, the chatbot continues to answer health- and law-related questions.

The Future of Professional Services#

Despite the challenges, some professionals recognize AI's potential to increase access. Golnoush Goharzad, a personal injury and employment lawyer in California, notes that AI can help people who cannot afford upfront legal costs. For those facing eviction or needing to file small claims, AI tools have sometimes helped secure victories.

However, Goharzad also encounters friends who believe they have valid lawsuits based solely on ChatGPT's assessment. When she asks, "Why? That doesn't even make any sense," they respond, "Well, ChatGPT thinks it makes sense."

The consensus among experts is that resistance is futile. Rather than fighting AI usage, professionals should guide patients and clients toward responsible integration. Schrumpf advises: "We need to keep as clinicians in the back of our mind that this might be a tool that is being used, and it can be very helpful, especially with some guidance and integrating it into our treatment plans. But it could go sideways if we're not paying attention."

Allen reinforces this perspective: "Ultimately, you do need a human in there to understand the nuances of the communication and the softer communication skills, and the unspoken communication skills, and the entire medical picture and the history." As AI becomes ubiquitous, professionals must assume their clients and patients are using it, and adapt their practices accordingly.

"AI will become the go-to tool for pre-care needs like symptom checking, triage, and navigation, as well as for routine tasks like refills and screenings."

— Oliver Kharraz, Zocdoc CEO

"Patients will recognize that it is no substitute for the vast majority of healthcare interactions, especially those that require human judgment, empathy, or complex decision-making."

— Oliver Kharraz, Zocdoc CEO

"We have to dispel the information that they were able to obtain versus what is actually going on in their case and kind of work backwards."

— Jamie Berger, Family Law Attorney

"You have to rebuild or build the attorney-client relationship in a way that didn't used to exist. They don't realize that there's so many offshoots along the way that it's not a linear line from A to Z."

— Jamie Berger, Family Law Attorney

"They really love that tempo of being able to know that ChatGPT never goes away, never goes to sleep, never says no, never says, 'sorry, your list is too long.'"

— Hannah Allen, Chief Medical Officer at Heidi

"It's great that they have the access to a quick second opinion, and then, if it doesn't agree with me, that allows them to ask me better questions."

— Heidi Schrumpf, Director of Clinical Services at Marvin Behavioral Health

"There's also a risk of voiding the kind of protections people get from the attorney-client confidentiality privilege if people put too much specific information about their case into a chatbot."

— Beth McCormack, Dean of Vermont Law School

"There's so much nuance to the law. It's so fact dependent."

— Beth McCormack, Dean of Vermont Law School

"We need to keep as clinicians in the back of our mind that this might be a tool that is being used, and it can be very helpful, especially with some guidance and integrating it into our treatment plans. But it could go sideways if we're not paying attention."

— Heidi Schrumpf, Director of Clinical Services at Marvin Behavioral Health

"Ultimately, you do need a human in there to understand the nuances of the communication and the softer communication skills, and the unspoken communication skills, and the entire medical picture and the history."

— Hannah Allen, Chief Medical Officer at Heidi

Original Source: Business Insider · Originally published January 7, 2026 at 09:17 AM
