Trump and Kennedy Seek To Relax Safeguards for AI Healthcare Tools
Paul Boyer, a psychotherapist for Kaiser Permanente in Oakland, California, is experiencing the AI revolution firsthand. He's a little underwhelmed.
The health giant has rolled out a new suite of note-taking software, made by healthcare AI pioneer Abridge, intended to summarize a patient's visit at supersonic speed. For many clinicians, the technology soothes one of the persistent headaches of their lives: administration and paperwork.
But the AI scribe caused another headache for Boyer and his colleagues: It is "not super useful." They end up correcting the computer-written notes.
Abridge is "not good at picking up on clinical nuance, at picking up on the emotional tone" that can be critical in the mental health field, Boyer said. For manic patients, for example, what's said is less important than how it's said, and the software struggles to pick up on those cues.
Note-taking software isn't the wave of the future; it's the wave of the present. Hospitals nationwide are implementing it. And researchers are finding some benefits. A year after installation, doctors who used these products the most saved more than half an hour of work daily, according to research published in April in the Journal of the American Medical Association.
Many doctors love the products where they're deployed; several have publicly sung the scribes' praises.
Nevertheless, as Boyer's example shows, there are persistent questions about the systems' quality. While Boyer and his colleagues spend time correcting notes, safety researchers worry clinicians might not be diligent about catching errors. That might mean future doctors rely on bad information.
Abridge says it evaluates its scribes at every stage of deployment, including with head-to-head tests against previous versions of the software.
"Following deployment of a model, we monitor clinician edits, star ratings, and free-text feedback from clinician users about note quality," the company's director of applied science, Davis Liang, said in a statement.
Artificially intelligent scribe software is part of a swarm of AI-powered tools coming to healthcare. Clinicians and patient-safety advocates say government regulations are not well constructed to guard against the threat that the new technology will miss or obscure important details of patients鈥 conditions, potentially harming them.
"There is currently no safeguard in place" to vet scribe software at the federal level, said Raj Ratwani, a researcher specializing in human factors (that is, how people interact with technology) at MedStar Health, a large hospital system based in Columbia, Maryland.
Ratwani worries that safeguards on health software will relax even further. A proposed rule from the Office of the National Coordinator for Health IT, the body that regulates electronic health records, the central chronicle of care for patients, could weaken requirements to make medical records understandable, easy to use, and transparent about the use of AI, Ratwani said. And an incomprehensible record could confuse clinicians and lead to errors.
Beginning in the Obama administration, the Health and Human Services Department's IT office required real-world user testing, in which developers try their products on doctors and nurses. Regulators also sought to require more transparency from companies in the surging market for AI tools.
Both of those requirements would be axed under the proposed rules from HHS Secretary Robert F. Kennedy Jr.'s health IT office.
Doctors and other health practitioners consult records for clinical information, such as scribe notes summarizing the history of patient care and lists of drugs and therapies their patients have used. Doctors also input orders for care.
Poor or cluttered design of a records system "might make the list of medications so complicated and confusing that the ordering provider selects the wrong medication," Ratwani said.
Abridge's general counsel, Tim Hwang, said the company "broadly supports" the government's rules as a "necessary modernization" that "accommodates the speed at which AI is evolving."
The old rules "put way too much burden" on electronic health record systems, said Ryan Howells, a principal at Leavitt Partners, which consults for digital health companies. Leavitt supports the proposals.
Dropping requirements, the administration argues, will result in more innovation and competition. The electronic health record market has steadily consolidated, with hospitals and other clinicians choosing from fewer vendors.
A 2022 study found the top two vendors, Epic and Oracle Health, dominated the hospital market. And Howells argued too many rules burdened providers looking for good record systems. Federal regulations, Howells said, are "the single biggest inhibitor to true clinical innovation."
The Trump administration proposal to remove requirements governing records is overbroad, some critics say. It removes regulations intended to keep records secure. It also eliminates privacy protections for sensitive medical data, overhauls standards governing the formats in which data is sent, and more. The rule may give clinicians "more health IT choices to meet their needs through increased competition," the government wrote in its proposal.
HHS' health IT office declined to comment, noting the proposal is still winding through the regulatory process. Public comment closed in February.
But most concerning to some, even in the hospital and developer sectors, are proposals to scotch prerequisites meant to ensure new products are tested on actual users and that AI tech's decisions are transparent to doctors and nurses.
"Historically, hospitals and health systems have been challenged by the black box nature of certain AI tools and how the algorithms are developed," the American Hospital Association's Jennifer Holloman said. And with more AI tools flooding the market, the association argues, transparency is even more critical.
Complaints about the safety of electronic health records are long-standing, even for seemingly straightforward tasks. Ratwani likes the example of ordering medication for a given condition.
"The physician is trying to order Tylenol, and the medication list can be so confusing that there's 30 different versions of Tylenol all at a different dose and for different purposes, when in reality that could be designed much more simply and make it easier for the physician to actually pick the right type of Tylenol that they're ordering," he said.
Real-world user testing was intended to simplify record design for doctors. But the administration is ending that requirement in a confusing way, said Leigh Burchell, vice president for policy and public affairs at Altera Digital Health, an EHR developer.
In Burchell's interpretation of the rules, which refer to "enforcement discretion," a principle under which the government can opt not to enforce certain rules, companies are still required to do the testing (the part that takes work) but are not mandated to report their results to the feds.
The administration is also ending a Biden-era idea to create AI transparency "model cards." The concept was that clinicians could explore the data used to train AI tools that advise clinicians with a simple mouse click. But few took advantage of the year-old tool, Trump's regulators say.
Still, hospitals and doctors are wary of removing it. The tool "provides information on how a predictive or generative AI application was designed, developed, tested, evaluated and should be used. These data are critical to foster trust in AI tools and ensure patient safety," the AHA wrote in a comment letter to the HHS IT office. The American College of Physicians agreed, saying a "lack of clarity could undermine clinician trust, increase liability expense, and erode the patient-physician relationship."
Even developers aren't totally sure about the idea. Burchell said the electronic health records trade group she's part of had "a lot of different perspectives" on the issue. "Normally, we tend to be a bit more aligned on our responses."
Still, Burchell鈥檚 group thought companies should be transparent about the data AI relies on to make decisions and how it comes up with recommendations.
Evidence for AI tools' effectiveness is mixed or contradictory.
A study comparing 11 AI scribes for potential use in a Veterans Health Administration pilot found the software performed worse than humans across five simulated scenarios. "Although ambient AI scribes can generate complete notes, the overall quality remains broadly below that of human-authored documentation," the authors noted, with the omission of information being particularly concerning, given the potential to affect follow-up care.
The vendors in the VA study weren't identified, for what the authors called "contractual reasons."
And that鈥檚 just one type of AI tool. A wave of them is coming, each needing its own evaluation, to say nothing of tools that have already been installed.
Boyer said he can mostly ignore his AI scribe, for the moment. But he worries that management will design his job around the expected time savings and schedule more patients, meaning he'd need to spend more time both seeing patients and correcting the software's errors.
A KP spokesperson, Vincent Staupe, said the company does not require its clinicians to use AI.
"When I am correcting that note, I feel like this is too much work," Boyer said. "This is definitely making this worse, and this is taking up time that I need to not be spending on correcting an AI tool."