Preparing PAs for AI and Beyond

By Matt Phillion

A recently released report from Wolters Kluwer, Future Forecast: The Growing Impact of PAs in the Healthcare Ecosystem, takes a deep dive into how physician associates (PAs) are dealing with rapid changes in healthcare, particularly in the area of AI use.

According to the survey:

  • 56% of PAs surveyed report that they use AI daily
  • 19% report they rely on AI extensively
  • 87% acknowledge the need to learn more about AI, with 83% expressing a desire for more formal, employer-led training
  • 89% say that AI will dramatically change PA practice, but only 32% report having clear workplace guidelines for safe use of AI

“We decided to focus on PAs because it’s one of the fastest-growing areas of the industry. There are surveys about academic use of AI, but it’s growing and changing so quickly that we wanted to get a pulse check with regards to this profession we are focusing on,” says Kelly Villella, Director, Medical Education & Medical Practice at Wolters Kluwer Health. “PAs and AI are both transforming healthcare right now, and it’s interesting to see that intersection.”

It’s also a helpful reminder that PAs are grappling with the same challenges all healthcare professionals—indeed, all industries—are facing during the AI revolution, Villella notes.

“We need to note that 87% said they need more training,” she says. “It’s a predominant topic at every education conference, every healthcare conference, and we need to zero in on what this looks like, what the preparation we need in education looks like, what kind of onboarding and governance is needed.”

The survey also found that PAs are most looking forward to productivity tools that enable them to put more of their focus on the patient.

Guidance and trust

Just under a third of respondents said they had clear guidance on using AI solutions in their organizations, and that lack of clarity can open up potential risk, Villella explains.

“Shadow AI is a risk every organization is facing,” she says. “There are all kinds of solutions we have access to, but we want to make sure we’re using the right ones. This starts at education, guiding and training faculty and students to use vetted, evidence-based solutions that their organization already trusts.”

Organizations need to quickly look at partnering opportunities to get these tools embedded in trusted solutions so they don’t have PAs or other providers branching out to non-trusted, non-authorized tools they believe will make their lives easier.

“If people aren’t explicitly given the guidance they need on how to use AI so it’s safe,” that increases risk to the organization, Villella says. “You don’t want to put patient information into an open-source solution, for example.”

One of the core questions is how an organization can integrate new technology into a robust curriculum, and where to focus to create the best possible patient care with that integration.

“I just saw a PA for one of my own appointments, and she had a PA student with her. She was telling me that the documentation systems are different at every place she works. There’s a lot of opportunity for efficiency and more focus on the patient,” says Villella. “Documentation can be a burden. If clinicians are focusing on entering the patient notes, they’re less focused on the patient.”

It’s going to require both academic and healthcare organizations working on integrating emerging technologies, adopting AI tools from reputable sources, and preparing students for appropriate use, Villella explains.

“That’s what’s critical now. Everyone’s catching up,” she says.

Healthcare already has its fair share of regulations regarding confidentiality and data, Villella points out. But the industry will need tweaks and clarifications to address emerging technologies.

“I’m not sure it has to be a national regulation so much as each organization has to develop and communicate its AI policies very quickly,” Villella says. “I think it’s going to get far more concrete as they’re adopting technology that offers these tools.”

It’s about more than just having a conversation, though. The organization needs to have the tools already vetted before they are adopted.

“Even in my own organization, for example, we can’t use outside tools. They need to be vetted by security. That’s what’s going on with healthcare, and that’s not the job of the PA,” she says.

Not every PA, or every working professional in general, will even know what shadow AI means.

“We need to help them,” says Villella. “With a revolution like AI, we need to be surveying more often, taking quick pulse checks. Every academic year, programs will evolve and incorporate new tools that leverage AI for learning and clinical use cases, which leads to more onboarding, more best practices, more learning about risks and advantages.”

One benefit PA education has is that its educators are practitioners themselves. Nearly all faculty are also out delivering care, and they bring first-hand experience to share with incoming practitioners.

“It takes time for an organization to move, but there’s a lot of awareness, advocacy, and focus,” says Villella. “The next few years, the evolution is going to be rapid, but what do these tools and processes look like in three years? We won’t know until they are embedded into the workflow itself.”

An adaptable profession

Faculty educators have experience and expertise in protecting patient data and in the challenges those areas present, Villella notes. As the guidance and the tools themselves continue to evolve and shift, staff will need to be adaptable and flexible—something PAs excel at, Villella explains.

“They’re already trained to be adaptable,” she says. “They’re serving in a variety of different specialties and are transformative in healthcare because there are many areas, specialties, regions that are facing physician shortages. PAs are well-prepared for flexibility. It’s built into their training.”

The two fastest-growing professions in healthcare are PAs and nurse practitioners, and that ability to adapt is one of the reasons why, Villella explains.

The survey found that PAs feel prepared for the workplace: 95% said their education prepared them for it, though 23% said they were less confident about being prepared for documentation, and 20% said they were less confident about prescribing medications.

One area where AI is being used but needs more guidance is as a research tool, Villella notes.

“Speaking broadly, you could go to Google for research, but we strongly recommend against using that as your source of truth,” she says.

The same goes for AI that has not been approved as an evidence-based, trusted solution. “When properly integrated into those trusted systems, technology can amp up the speed, help with synthesis, and provide links to sources. There’s a difference between using an open source and having it embedded in a system where it’s only drawing from evidence-based information. It’s crucial that PAs and PA educators get themselves educated on the AI features and functionalities in systems they already trust.”

There are tools out there using AI well—radiology tools assisting with identifying evidence of cancer cells, for example. But it’s important to understand the guidelines, parameters, and pitfalls of any given tool before using it.

“We need evidence-based, expert solutions. That’s the way to go,” Villella says. “Open is not safe. Do you want to put your trust there when your patient’s health is at risk?”

Villella looks forward to a time when these technologies help with efficiency, documentation, and improved workflow.

“I’d love to see a few years from now when PAs can say, ‘Yes, I’ve been trained on new tools powered by AI, I’m comfortable with it, and it’s making me more efficient,’” says Villella. “We can all be excited about the possibilities and potential, but wary of the pitfalls.”

Matt Phillion is a freelance writer covering healthcare, cybersecurity, and more. He can be reached at matthew.phillion@gmail.com.