What Does ChatGPT Health Mean for Healthcare?
By Jay Kumar
Earlier this month, OpenAI launched ChatGPT Health, described as “a dedicated experience that securely brings your health information and ChatGPT’s intelligence together, to help you feel more informed, prepared, and confident navigating your health.”
Needless to say, this has prompted plenty of discussion. Here’s what several health industry experts had to say on the subject.
Heather Bassett, MD, Chief Medical Officer, Xsolis, an AI-driven health technology company fostering collaboration between healthcare providers and payers
Patients are already turning to ChatGPT for healthcare questions at a staggering rate, with OpenAI reporting 40 million daily users seeking health guidance. Anthropic, for its part, recently launched Claude for Healthcare. Both of these consumer-facing AI tools are an important step toward making patient interactions more secure and contextually relevant by connecting personal health data. But we must be clear-eyed about what this technology can and cannot do. Large language models can support patient engagement, but they cannot replace the lived experience and clinical judgment of physicians and nurses. Real-world medicine rarely resembles textbook medicine—it is nuanced, contextual, and deeply human. The real opportunity isn’t replacing clinical judgment but helping patients arrive better prepared for meaningful conversations. When patients better understand their conditions, they are better positioned to engage in shared decision-making with their care teams, supporting more appropriate care decisions over time without compromising outcomes. AI’s value in healthcare will be defined not by how independently it operates, but by how well it strengthens the clinical conversations where real decisions are made.
Fawad Butt, CEO and co-founder, Penguin Ai, a healthcare AI company that solves administrative burdens
I see the promise of ChatGPT Health, but I remain cautious about the tradeoffs, especially around trust and data stewardship. The core concern is security: we’re moving sensitive medical data out of governed, compliant environments into platforms where protections may be unclear and where users risk becoming the product, potentially fueling model training. Until transparency, consent, and true healthcare-grade safeguards are non-negotiable, this is very much a buyer-beware moment.
Jay Anders, MD, Chief Medical Officer, Medicomp Systems, a clinical intelligence company that unlocks the true value of clinical data
ChatGPT has now entered the health data exchange arena. OpenAI states that the information is protected; however, there is no oversight, and no written commitment to the patient/user that this will continue in the future. Users need to understand that there is NO recourse should OpenAI decide to use their data in any way it sees fit. Privacy laws concerning the use and storage of PHI need to be applied here to mitigate the unregulated use of patient data. Words are just that … words.
Mark Pratt, MD, Chief Medical Officer, Altera Digital Health, a global health IT company
AI-powered health tools like ChatGPT Health represent a meaningful step forward in patient engagement and health literacy. When patients arrive better informed—with questions prepared and a clearer understanding of their lab results—it elevates the entire care conversation. Yes, this will challenge physicians to engage differently, but that’s a challenge worth embracing. More informed patients lead to better outcomes. The key is perspective: AI is a powerful tool for understanding, not a replacement for the clinical expertise, judgment and relationship that only a healthcare professional can provide. Use it to learn. Then talk to your doctor.
Greg Farnum, SVP and GM, Audacious Inquiry, which provides on-the-ground expertise in health information systems, public health information systems, and health information exchange
This is an exciting development, and I’m hopeful that others will continue moving into the health market, keeping in mind the principles of sustainable AI deployment that balance patient trust and the physician-patient relationship while accelerating toward more efficient delivery and access to care. When patients have greater access to their medical records, they are better able to engage with their providers and collaborate on the care they receive. This development will undoubtedly shape features in patient-facing fitness and prevention apps, as well as traditional clinical tools, making it more appealing for young people to engage early in their healthcare, a critical component as we move toward preventive care.
Steve Buslovich, MD, CMO of Senior Care, PointClickCare, a leading health tech company helping providers deliver exceptional care
The introduction of ChatGPT Health highlights how quickly AI is becoming part of everyday health decision-making, but also raises concerns about trust, accuracy, and clinical context. In senior care, AI must support informed decisions and contextualize risk for the individual while accounting for the complexity of older adults and the interdisciplinary teams who care for them. The greatest value comes from AI that is thoughtfully designed, grounded in high-quality, validated, longitudinal data, and used to reinforce professional interpretation while preserving the human connection that remains central to caring for vulnerable populations.
Anita Phung, Research Physician, Lindus Health, the ‘anti-CRO’ running radically faster, more reliable clinical trials for life science pioneers
ChatGPT Health represents a significant opportunity to scale clinical decision support globally—especially in resource-limited settings where better diagnostic support could meaningfully improve patient outcomes. But as recent preprint work involving OpenAI and Penda Health shows, the field urgently needs the same rigorous evidence standards applied to pharmaceuticals: pre-registered protocols, transparent conflict-of-interest disclosures, and peer-reviewed validation before claims of widespread deployment. The real challenge isn’t whether AI can transform healthcare, but whether we will build the proof stack and regulatory frameworks needed to ensure these tools are clinically effective and safe at scale.
Andy De, Chief Marketing Officer, Lightbeam Health Solutions, a company offering population health enablement technology and solutions
Patients are increasingly comfortable researching health information on AI platforms like ChatGPT, as well as using them for self-service tasks such as scheduling appointments, exploring symptoms and treatments, and searching for providers. The new healthcare research capabilities align with this trend and will empower patients with valuable information for personal health management.
Brian Kenah, Chief Technology Officer at EnableComp, transforming complex claims, denials, and revenue recovery through its AI-driven e360 RCM® platform and specialized expertise
The launch of ChatGPT Health — and the fact that more than 230 million people are already asking health questions through ChatGPT every week — signals that AI in healthcare has reached unprecedented scale and maturity. As consumers become more comfortable connecting medical records and wellness data to tools like this, healthcare organizations will increasingly expect the same level of seamless integration and intelligence on the enterprise side. What OpenAI is demonstrating is that AI can navigate healthcare’s complexity, including data interoperability, regulation, and personalization at scale. This reinforces the idea that AI-driven platforms are better suited than labor-intensive models for managing complex healthcare operations. The rapid expansion of connected health data platforms shows the infrastructure is finally ready, making advanced healthcare intelligence not just possible, but inevitable.
Steven Lane, MD, Chief Medical Officer, Health Gorilla, a designated QHIN under TEFCA
The healthcare-specific applications that ChatGPT and now Claude have announced, built on their powerful AI infrastructure, represent revolutionary new opportunities to leverage health data to improve care. These new tools depend on low-friction access to as much high-quality health data as possible, so we need to keep pushing ahead on the persistent challenges of clinical documentation; secure, privacy-protected exchange; and health data quality and usability.
Greg Miller, VP of Business Development and Marketing, Carta Healthcare
We are excited to see OpenAI invest in healthcare through the launch of ChatGPT Health. We view it as an important signal that AI can meaningfully improve how clinical information is accessed and used. Importantly though, good AI alone is not good enough. Clinical use cases demand strong governance, transparency, and human oversight to ensure trust, safety, and real-world reliability.