Veterans Affairs Nurses Management Scrutinized After Patient Deaths in Two States (Norman Costa)
A re-editing of an article by Tracy Weber and Charles Ornstein
ProPublica, April 30, 2012, 1:19 p.m.
After a patient died last year at a Veterans Affairs hospital in Manhattan, federal inspectors discovered nurses in his unit [Management] had a startling gap in their skills [supervision and training of nursing staff]: They [Management] didn't understand [they were responsible for the competence and training of nursing staff as to] how the monitors tracking vital signs worked.
None of the nurses [management] interviewed could accurately explain [their lapse in supervision and training of nursing staff, so that nurses could tell] what would happen if a patient became disconnected from a cardiac monitor — which allegedly occurred to the patient who died, according to an October 2011 report from the U.S. Department of Veterans Affairs' inspector general.
The incident followed two deaths in the cardiac monitoring unit at a VA hospital in Denver that raised similar questions about nurse [Management] competency.
Earlier this month, a broader review by the VA inspector general of 29 VA facilities found only half had adequately documented that their nurses had the needed skills. Some nurses "did not demonstrate competency in one or more required skills," but there was no evidence of [Management taking charge and providing the] retraining [for nursing staff], the report said.
An outside nursing [management] expert who reviewed the reports at ProPublica's request called them "troubling" and said the fact that the [management] lapses weren't caught and corrected "signified much broader [management] problems."
The inspector general's findings reveal [the management showed] "a lack of oversight and adherence to accepted clinical and regulatory standards," said Jane Hirsch, a clinical professor emeritus at the University of California, San Francisco School of Nursing, who previously oversaw nursing at U.C. San Francisco Medical Center.
The April 20 IG report also noted that previous inspections had found [lax management, poor supervision, and absence of training were responsible for] nurse competency issues in "dialysis, mental health, long-term care, spinal cord injury, endoscopy procedure areas, the operating room and the cardiac catheterization laboratory and with reusable medical equipment."
In a response to the inspector general, the VA pledged to create uniform competency standards for [management of] its 152 hospitals and to ensure that evaluations of every nurse's [manager's] skills are up-to-date. Nurses [Managers, supervisors, and trainers] will not be able to work in areas in which they have not demonstrated [executive and supervisory] competency.
A VA spokeswoman declined further comment.
Nurse [Management] competency has increasingly become an issue in medicine. Hospitals and clinics create their own procedures and tests for assessing the skills of nurses [management], but their adherence to these policies is spotty.
Outside regulators don't test individual nurses [supervisors], but simply check if a sampling of the nurses' [supervisors'] files have the appropriate paperwork certifying [supervisory] competency.
That's what VA's inspector general did for the April review. As such, officials acknowledged that they could not verify whether [management is doing what was necessary so that] nurses at those hospitals, or others, are providing competent care.
"We did not look at [management's ultimate responsibility for] actual care or actual [management] competence," Julie Watrous, director of the inspector general's combined assessment program, which inspects each VA hospital every three years, told ProPublica.
Only half the 29 facilities included in the new report had complete [management, supervisor, and] nurse skill assessment records that met the hospitals' standards, inspectors found. Of the 349 nurses whose files were examined, paperwork showed that [management was deficient in providing proper supervision and training to nursing staff. This resulted in] 58 [nurses who] lacked skills in at least one area. And for 24 in that group, there was no evidence that anything was done in response [by hospital management].
In an interview, however, the IG official who coordinated the report said she was generally pleased with the findings. Although both the VA and its hospitals had room to improve, she said, all of the hospitals had policies in place and at least some proof of skills in each nurse's [manager's] file.
"We never found one single site or even person [manager] that didn't have at least components of competency assessment and validation," said Carol Torczon, associate director of the St. Petersburg, Fla., office of the inspector general. "Where we found the holes [in the management system] was in the paper process."
Torczon said she believed that the problems identified in Denver and New York were not reflective of the care generally provided by VA nurses in cardiac monitoring units.
Inspectors in the New York and Colorado cases said they could not definitively tie the deaths of the patients to [the failure of management and supervisors to adequately train and monitor] their nurses' care. But they noted that the [management's own] lack of [providing proper supervision and] training put patients at risk.
Registered nurses assigned to telemetry units typically place cardiac leads, set parameters for the monitors tracking each patient, verify heart rhythms and take appropriate actions if there is an irregularity. They also enter progress notes and inform doctors of any changes.
After the patient in New York died, inspectors quizzed nurses and a biomedical engineer about what would happen if a patient got disconnected. "According to some staff, a 'red alarm' would be triggered since a disconnected lead was considered critical," the report said, "whereas other staff told us that a disconnected lead would trigger a yellow alarm or that it would not trigger any alarm at all." [Clearly management and supervisors had a lot of work to do to improve training and competence. After all, it's their job.]
Inspectors also found no evidence that the nurses' [manager's] competence had been checked. Records showed [serious management lapse in] that one of the patient's nurses had last received training on the monitors 13 years earlier.
Two years earlier at a VA hospital in Denver, inspectors looked into the deaths of two patients on cardiac monitors. After the first death, the hospital gave nurses a basic test of their ability to interpret monitor readings: only one of 28 passed, according to a January 2010 report. The nurse in charge when both patients died had never received specialized training in cardiac monitors. [This was a clear demonstration of a failure of management and supervisors to train, improve, and monitor staff competence.]
Even after the second patient died in 2009, inspectors found "it was unclear [if management understood] who was responsible for telemetry training, and [that management had no clue that] staff were not aware that policies had been updated."
Both facilities vowed extensive reforms [in management and supervisory practices] in responses that were included in the IG reports.
Experts say up-to-date competency evaluations are important because they ensure that nurses [management and supervisors], who provide [staff training for] the bulk of the frontline care in hospitals, have the skills for their position.
"It would appear that the old adage 'inspect what you expect' has most certainly not been taken very seriously [by management] in these environments," said Hirsch, who was chief nursing officer at UCSF Medical Center for nine years.
After reading the New York and Denver reports, Hirsch said her concern wasn't the incidents themselves as much as that the competency of [management in the supervision and training of] the nurses hadn't been documented or evaluated in a long time.
Had she been in charge, the findings would have caused her "to be really nervous and want to jump on [management's asses and have them fix] it immediately," she said.