Risky Business

[Photo: Group of nurses outside holding signs "Patients are Not Algorithms" and "Trust Nurses, Not A.I."]

Hospitals are embracing artificial intelligence to replace us, but nurses know data points are not true nursing care. How we’re fighting back for our patients and our profession.

By Lucia Hwang

National Nurse magazine - April | May | June 2024 Issue

As all registered nurses know, one of the most critical components of starting a shift off right is getting a good report at hand-off from your colleague. The previous shift’s nurse will give you the inside scoop on how to best care for your patient: labs you are waiting on and which values to home in on, whether medication got shifted back two hours because pharmacy screwed up the order, a special happy memory to make changing that wound dressing a little more bearable, or potential workplace violence risks.

But at University of Michigan Health-Sparrow in Lansing, Mich., where RN Jeff Breslin works as a float pool nurse, the hospital over the past few years has instituted automated hand-offs. With automated hand-offs, the electronic health record (EHR) system in which nurses chart pulls together sections of the chart and highlights certain data for the next shift’s nurse; no human-to-human communication happens. Sparrow’s EHR system, Epic, brags on its corporate website that “With Epic, Generative A.I. seamlessly integrates into your Electronic Health Record (EHR)... see how A.I. personalizes patient responses, streamlines handoff summaries, and provides up-to-date insights for your providers.”

Breslin is not impressed. He has seen that the automated hand-off reports often omit critical information or overstate the importance of other data. Usually, he intervenes before any harm is done, such as when he hunted for information about a patient’s ordered medication that had been charted in a different area of the record than expected. “Documentation in the med record was missing, so if I would not have caught it elsewhere in the chart, I could have given a dose that could have been harmful,” said Breslin, an experienced nurse who has worked at Sparrow for 29 years. “I did end up catching that because knowing what the patient was in for, and that the normal treatment would involve certain drugs, I dug deeper into the notes and saw that they had documented it -- just not in the proper place in the chart, and it was not contained in the electronic hand-off.”

The automated hand-offs, to Breslin, are not useful and cannot replace a nurse-to-nurse report. “It’s better to have an actual conversation with someone who has taken care of the patient,” he said. “It takes some of the guesswork out of receiving a new patient by having that face-to-face, human-to-human interaction.” And that’s exactly what the Sparrow nurses are fighting for: to restore person-to-person report -- whether face-to-face or by telephone -- as the norm.

Automated hand-offs powered by A.I. algorithms are just one example of the flood of new technologies -- hardware and software, programs, platforms, apps, cameras, screens, you name it -- that our hospital employers are implementing in our workplaces. They claim they are doing so to “increase efficiency” and “help out” us busy nurses during a so-called “nursing shortage,” but we know from experience that they have a less altruistic goal: to increase profits by cutting nursing staff, removing us from direct care, and deskilling and degrading our professional practice. As many of the nurses interviewed for this story point out, “efficiency” is a business goal, not a nursing goal. And we know the “nursing shortage” is a lie: 1.3 million actively licensed RNs are not working because they refuse to practice under the unsafe conditions hospitals are providing. Finally, we know A.I. has shown harmful biases against Black, Indigenous, and other people of color because it draws from data generated by an already prejudiced health care system.

A winter 2024 National Nurses United (NNU) survey of 2,300 RN members about A.I. and technology revealed disturbing experiences. About 40 percent reported that their employers had introduced new devices, gadgets, and changes to the EHR over the past year, but 60 percent disagreed with the idea that their hospitals would prioritize patient safety when implementing them. About half said their facilities use A.I. algorithms to determine patient acuity, but about two-thirds said the computer-generated acuity measurement did not match their real-world assessment. Almost a third, 29 percent, said they could not override the system when they disagreed with determinations and categorizations generated by the A.I. software.

As Big Tech and Big Health seek to adopt these technologies at breakneck pace, NNU believes our union must proactively help nurses educate themselves and the public about them, organize and fight back when they threaten patient safety and nursing practice, and participate in the development of legislation and policy to regulate their use. Unlike a new drug or treatment, which must be tested and approved by the Food and Drug Administration for safety and efficacy -- even compared to a placebo or the status quo of doing nothing -- almost all A.I. currently undergoes no testing, verification, or validation.

To that end, NNU has developed a “Nurses and Patients’ Bill of Rights: Guiding principles for A.I. justice in nursing and health care,” a document that will serve as a touchstone for evaluating upcoming policy and legislation to regulate A.I. These rights include the right to high-quality, person-to-person nursing care; right to safety; right to privacy; right to transparency; right to exercise professional judgment; right to autonomy; and the right to collective advocacy for workers and patients. See the sidebar and NNU’s website for more details. 

“It’s actually incredibly shocking when you consider that hospitals are deploying these technologies immediately into real-life patient care health settings where people’s lives are on the line and actual harm can be done to patients,” said Michelle Mahon, RN and assistant director of nursing practice at NNU. “Patients are not guinea pigs and most companies are offering no proof that their products are safe or effective.”

[Photo: Two nurses outside holding signs "Patients are Not Algorithms" and "Trust Nurses, Not A.I."]

What exactly is A.I., the acronym for “artificial intelligence”? It has meant different things at different times, but the definition that NNU uses now is as follows: A.I. is a machine-based technology that attempts to mimic human intelligence by taking inputs in the form of massive amounts of data, processing them through algorithmic software, and generating outputs from that process. Those outputs may be in the form of predictions, content, recommendations, directions, or decisions. In the health care context, A.I. often analyzes and generates recommendations or other conclusions based on patients’ electronic health records and other sources of data collected from patients, health care workers, and the environment.
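In concrete terms, that input-to-output pipeline can be pictured with a toy sketch like the one below. Every name, weight, and threshold here is invented purely for illustration; it is not any vendor’s actual model -- which is precisely the point, since the real ones are far more opaque.

```python
import math

# Toy illustration of the "inputs -> algorithm -> outputs" pipeline
# described above. All features, weights, the offset, and the cutoff
# are invented for illustration; this is not any real clinical product.

def risk_score(vitals):
    """A logistic-style score: weighted inputs squashed to a 0-to-1 risk."""
    weights = {"heart_rate": 0.03, "resp_rate": 0.1, "temp_f": 0.2}
    z = sum(weights[k] * vitals[k] for k in weights) - 24.0  # assumed offset
    return 1 / (1 + math.exp(-z))

# Input: data pulled from a hypothetical patient's chart
patient = {"heart_rate": 112, "resp_rate": 24, "temp_f": 101.2}

# Output: a prediction, turned into a recommendation by a cutoff
score = risk_score(patient)
print(f"Predicted risk: {score:.2f}")                    # about 0.88
print("Flag for review" if score > 0.5 else "No alert")
```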

This type of data aggregation and processing has been going on for decades, but what has catapulted A.I. into the headlines lately is the phenomenon of “generative A.I.,” where a user can prompt the computer or system to draw upon the data available to it in order to produce what appears to be an original piece of content, whether an essay, piece of artwork, or a nursing care plan.

The goal, as always, is profit. Nowhere was this more blatant than in a March 2024 announcement by chipmaker Nvidia that it was partnering with Hippocratic AI to offer health care institutions generative A.I. “nurses” that cost only $9 per hour to operate. In its promotional video, Nvidia shows an A.I. “nurse” named Rachel interacting with a patient who has just been discharged home after an appendectomy. They discuss medications, and the avatar on the tablet offers some general recovery education.

A.I. is already being used in software systems that determine staffing and scheduling, clinical prediction, remote patient monitoring, automated charting and nursing care plans, and more. A.I. also makes possible many programs that at first blush would not appear to be related to A.I., such as tele- or remote-sitting and hospital at home, because these programs rely largely on A.I.-driven software to alert humans when something is wrong.

One of the most frustrating aspects for nurses of confronting A.I. on the job is the lack of transparency about when A.I. is in use. Does this scanner use A.I.? Does that pump that’s always squawking alerts use A.I.? Employers usually do not disclose or make announcements about the deployment of A.I. technologies, even though they should and, in many of our hospitals, are contractually obligated to do so and bargain over their effects. 

But if you browse the websites of the companies creating these technologies, many very clearly tout that their product integrates A.I. It’s probably safe to say that almost every nurse is currently encountering A.I. through whichever electronic health records system their hospital uses. Nurses’ documentation feeds directly into the EHR and into the patient classification systems that determine acuity. The two big players that dominate the EHR market are Epic and Oracle Health (formerly Cerner), whose acuity and staffing product is called Clairvia.

The main problem nurses report with these EHR systems is that they misclassify patients at a lower acuity than they actually are, thereby leading to lower staffing levels than are actually needed to do the care work and keep patients safe. Every four to five hours, the system analyzes whatever the nurses have charted and uses that to predict what staffing should be for the next shift. However, nurses are often not able to chart in real time when their first priority is providing safe patient care, and they also point out that certain aspects of the care they provide simply are not, or cannot be, accurately captured by the EHR.
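To see why a snapshot of charted data understates real workload, consider this minimal sketch. The point values and item names are invented for illustration; this is not Epic’s or Clairvia’s actual algorithm.

```python
# Hypothetical "snapshot" acuity scoring with invented point values --
# not Epic's or Clairvia's actual algorithm. The system can only count
# what has been charted; care delivered but not yet documented is
# invisible to it.

ACUITY_POINTS = {
    "med_administration": 1,
    "wound_care": 2,
    "continuous_bladder_irrigation": 3,
    "patient_education": 1,
}

def snapshot_acuity(charted_items):
    """Score a patient from charted items only."""
    return sum(ACUITY_POINTS.get(item, 0) for item in charted_items)

# Care actually delivered so far this shift...
delivered = ["med_administration", "continuous_bladder_irrigation",
             "wound_care", "patient_education"]
# ...but only part of it is documented when the snapshot runs, because
# the nurse has been at the bedside instead of the keyboard.
charted = ["med_administration", "wound_care"]

print(snapshot_acuity(delivered))  # 7 -- the real workload
print(snapshot_acuity(charted))    # 3 -- what the staffing model sees
```

Staff to the smaller number, and the next shift starts short.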

For example, Kaiser Permanente nurses report that nursing care hours for treatments like continuous bladder irrigation or intravenous immunoglobulin, which require constant monitoring and frequent trips into the patient’s room, are not properly accounted for within Epic. Pediatric oncology RN Craig Cedotal noted that many hours of preparation, checking, and double-checking by a second nurse happen for kids receiving chemotherapy before the patient even sets foot in the facility. This work, and the time it takes to do it, is not accounted for by Epic in determining staffing levels.

“I don’t ever trust Epic to be correct,” said Cedotal, who works at Kaiser Permanente Oakland Medical Center. “It’s never a reflection of what we need, but more a snapshot of what we’ve done.”

This type of “snapshot” staffing is particularly inappropriate for certain kinds of units, said Allysha Shin, an RN in the neuro ICU at Keck Medicine of USC and a California Nurses Association/National Nurses Organizing Committee board member. “Our unit is always very busy,” said Shin. “We can easily get six to eight admissions in one shift. Sometimes they are emergency admissions from outside hospitals or even a code stroke that happens in house.” But since her hospital transitioned to Clairvia about six months ago, the nurses noticed that resource nurse hours are being whittled away because the system is saying the nursing demand hours don’t justify the staff. When they argued with the hospital, management said, “No, we are only going to staff for what we have at the moment in real time.”

Lastly, and arguably most importantly, nurses say that these EHR systems do not account for the time it takes nurses to provide the education and compassionate, psychosocial care to patients and their families that is absolutely critical to their well-being and successful healing. “It’s about you putting your hands on a patient and just loving them and showing how much you care, how much you value them,” said Aretha Morgan, an emergency pediatric RN at NewYork-Presbyterian hospital in Manhattan and a New York State Nurses Association board member. “You cry when they’re sad. You’re happy when they’re getting healed. You’re the one that when they’re about to fall, you’re catching them. We should not ever ascribe to any type of artificial intelligence that moves us away from taking care of our patients. Our patients are number one.”

As Deb Quinto Capistrano, a Kaiser Permanente San Francisco RN who works in a medical-surgical-telemetry-stroke unit, notes, she is continually providing patient education during the patient’s stay, but the Epic acuity system gives her no credit for this work that should translate into more time and better staffing. “We do patient education from the start of admission through their time in the hospital, but we only get education points when they have an active discharge,” said Capistrano. 

Conversely, aspects of the EHR systems that are supposed to proactively help nurses often do the opposite: They make nurses spend valuable time responding to false “advanced alert monitoring” or “early warning system” alarms for conditions such as sepsis. For Melissa Beebe, an oncology RN at UC Davis Medical Center in Sacramento, Calif. and a member of CNA/NNOC’s joint nursing practice commission, the sepsis alerts generated by their Epic EHR are often incorrect. The system will flag patients as likely sepsis cases when Beebe knows, in her professional judgment, that this is extremely unlikely, and miss the ones who she knows truly are in danger and on whom she is already acting. “Everybody on my unit knows the sepsis warning system is not helpful,” said Beebe. “I’ve sent people to ICU and that thing never went off.”

Beebe’s experience matches the published research: One A.I. early warning system (EWS) analyzed patient data with the goal of identifying patients at substantial risk of developing sepsis, and it was widely implemented at hundreds of hospitals throughout the country. However, when this sepsis EWS underwent external validation, researchers found that the program missed over 67 percent of sepsis cases. The authors of the study, published in August 2023 in JAMA Network Open, concluded of the EWS that “it appears to predict sepsis long after the clinician has recognized possible sepsis and acted on that suspicion.”
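To make the arithmetic behind that finding concrete, here is a short sketch. The counts are invented solely to match the reported 67 percent miss rate; they are not the study’s actual data.

```python
# Invented counts chosen only to illustrate a ~67 percent miss rate;
# see the JAMA Network Open study for the real data.
true_sepsis_cases = 300  # patients who actually developed sepsis (assumed)
caught_by_ews = 99       # of those, the cases the alert flagged in time

sensitivity = caught_by_ews / true_sepsis_cases  # true positive rate
miss_rate = 1 - sensitivity

print(f"Caught: {sensitivity:.0%}")  # 33% of sepsis cases flagged
print(f"Missed: {miss_rate:.0%}")    # 67% of sepsis cases missed
```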

These faulty alerts are not merely an annoyance, though. They take real time away from needed patient care. E.C. Mitchell, an ICU nurse at Kaiser Permanente in Modesto, Calif. who also serves on his hospital’s rapid response team, noted that he could spend close to an hour following up on false alerts. “This is a waste of time when I could be checking in on the really sick patients, or doing actual procedures,” said Mitchell. “I have to call the attending physician to respond to it, and then we both write a report on it. All these things are taking up time. The bigger deal is that it doesn’t really catch the problem patients, so it is a false sense of security. I find the really sick ones by checking in on the nurses, by asking them, ‘Any concerns about your patients?’”

Besides EHRs, nurses also report encountering new programs and initiatives such as Qventus, a software platform that automates discharge planning to reduce length of stay, and Desktop Medicine, a Kaiser system that uses A.I. to categorize patient messages to “[assist] regional staff in resolving about a third of the messages so they never reached the inboxes of busy doctors,” according to Kaiser’s own website. Nurse members who work in areas where Desktop Medicine has been implemented, however, note that the messages that do not pass through to a physician get diverted to a medical assistant, who then determines whether the message should go to a registered nurse. Heather Aguirre, a charge RN who works in women’s health at Kaiser Permanente in Napa, Calif., noted that this process was wrong on multiple levels.

“So my number one concern: What is the algorithm looking for initially?” said Aguirre. “And then next, an MA should not be doing triage. They should not be looking at messages to decide if it goes to a doc or nurse; they are not trained to do that. Nurses triage. So there should be a layer of nurses who review messages at the start.” 

But that’s not what’s happening.

[Photo: Group of nurses marching holding signs "Patients are Not Algorithms" and "Trust Nurses, Not A.I."]

Morgan, the New York City nurse, said it’s so frustrating to watch our hospitals dump hundreds of millions of dollars into all this unproven technology when a very simple and scientifically proven solution is and has always been immediately available: invest in safe staffing. “I feel like they are racing to put a man on the moon when we just need more nurses,” said Morgan, who also teaches nursing. 

So NNU nurse members are ramping up the long-haul campaign to reframe the conversation on A.I.: The public, nurses and other providers, and regulators need to hit pause on use of these technologies; question whether they are even needed and who benefits; and demand that the burden of proving A.I. and other data-driven tech are safe, effective, and equitable be placed on developers and employers before they are deployed in health care settings. 

In what may be the first public protest of A.I. tech by registered nurses, hundreds of RNs rallied and marched on April 22 in San Francisco outside Kaiser Permanente’s International Integrated Care Experience conference, which featured the system’s use of advanced analytics and A.I.

“We’re all for tech that enhances our skills and the patient care experience,” said Bonnie Castillo, RN and executive director of NNU and CNA/NNOC to a boisterous crowd of nurses holding placards reading “Patients are not algorithms” and “Trust nurses, not A.I.” “But we won’t stand for employers devaluing nursing care — and breaking our solidarity with our patients and each other — to boost profits by replacing skilled nurses with technology.”

And on May 9, during Nurses Week, nurses from around the country held a briefing for media to warn about the dangers of A.I. for patient safety and the integrity of their profession. RNs shared stories about how A.I. hurts, not helps, their work and fundamentally questioned whether A.I. belongs in health care at all. 

“Rather than worrying or being focused on how we make A.I. work, we should be questioning the values,” said Mahon during the briefing. “Why is A.I. here? Who is bringing it? What is behind this? Is it truly in the best interests of people? Is it providing human-centered care? Does it respect the values of dignity, human respect, and quality of life? This has in large part been overlooked amid the hype that A.I. is inevitable.”

Along these lines, NNU will be urging Congress to institute a regulatory process for ensuring that A.I. technologies are safe and effective, just as the Food and Drug Administration does for medications and medical devices. 

And, of course, NNU nurses continue to challenge A.I. in their facilities. Many of our contracts include technology language provisions that give nurses tools to push back against harmful use of technologies, including A.I. The hard work is in educating and organizing nurses to take collective action.

That is exactly what the nurses at Keck Medicine of USC did. When their employer announced that it would be instituting the Clairvia acuity system, the nurses were concerned that it offered far less clarity and transparency about how patients were classified than their old system, which gave clear examples of where patients would fall in acuity and allowed nurses to add reasons why they felt a patient should be designated at a higher acuity.

“We demanded to bargain,” said Shin, who is a professional practice committee member. “We wanted a commitment from the hospital, in writing, that you are going to leave responsibility for patient classification and acuity up to the bedside nurse. We were not going to relinquish this responsibility or authority.”

The nurses stood firm on this language and were preparing to stage a picket of the hospital when management finally issued a memo stating that only a licensed registered nurse could make the final decision about nursing care hours needed. 

Now that the nurses have this strong language, it is up to them to enforce it and push back, said Shin. “If a situation comes up, we’re going to have to say, ‘No, we were told it is up to us to determine patient acuity and I am telling you this person needs to be one-on-one and not paired.’ This is a way to empower nurses so that they know that, ‘No, I have the right to say that this patient is a higher acuity than what Clairvia says it is.’”

“Technology is going to be an ongoing issue for nurses,” said Shin. “It’s easy to get sucked into technology. I mean, I love my phone, my apps, but how much are you giving up in exchange for using technology? That’s the question.”


Lucia Hwang is editor of National Nurse magazine. Rachel Berger contributed reporting to this article.