
Learning From Mistakes


During a stressful situation in the Saint Agnes Medical Center ICU, a nurse began a bolus of a narcotic for the patient. In a rush, and distracted by distressed family members, the nurse operated the pump in a way that delivered too much medication. Fortunately, the nurse noticed the problem right away, and the patient was not harmed. A review of the incident revealed that other nurses used the same practice for bolusing, and the near miss became a learning opportunity for the unit.

"In the ICU, we are prone to error because of the speed with which we do things and the kinds of drugs we use," says Joyce Eden, RN, MHA, director of med/surg and critical-care services at the Fresno, Calif., hospital. "After this event, we brought in the entire staff and reviewed the whole chart. We made it clear this was a learning process, an error that we needed to learn from. It helped us change some things to make the ICU safer."

Medical errors are responsible for between 44,000 and 98,000 deaths in hospitals each year, according to a report from the Institute of Medicine. Even at the lower end of the estimate, that is more people dying from medical errors than from highway accidents, breast cancer, or AIDS. The majority of errors are not the result of individual recklessness, but basic flaws in the way the health system is organized, the report says.

Correcting those flaws requires a culture where reporting safety risks and errors is encouraged, says Kathy Harren, RN, chief nurse executive at Little Company of Mary Hospital in Torrance, Calif. In a formal survey in 2007, the hospital found that many staff worried they would be fired for reporting an error. "The first debate was how to create blame-free cultures, but then the concern was not holding people accountable," Harren says. "So we shifted to a 'Just Culture.'" Providence used a definition of "Just Culture" from the Patient Safety Rounds Toolkit of the Dana-Farber Cancer Institute in Boston: "giving constructive feedback and critical analysis in skillful ways, doing assessments based on facts, and having respect for the complexity of the situation," and "creating effective structures that help people reveal their errors and help the organization learn from them."

Rewarding staff for reporting errors or near-errors helps them get beyond the fear. "When staff see the intent isn't to blame, but a true desire to look at what systems fail, they become part of that," Harren says. The hospital also made a serious commitment to change policies identified as safety risks. "It is about cultural transformation, including a vision, guiding principles, and articulation of behavioral expectations for staff," Harren adds. An important part of the culture is rounds that Harren conducts with board members, where they interview staff members about safety risks they see as direct care providers.

The Saint Agnes culture is modeled after Keystone: ICU, a program developed by the Michigan Health & Hospital Association's Keystone Center for Patient Safety & Quality that uses Johns Hopkins University's collaborative model for transformational change. "The intent was that people feel safe to speak up about what they see that would potentially harm a patient or them," Eden says. "It is non-punitive, so when you see defects, you don't put a Band-Aid on them, but pull them up by the root. The only way to do that is a culture where you feel safe speaking up about issues."

Administration must be educated that increased reporting is a good sign. "I would argue, and the literature argues, that a hospital with low reports of medication errors isn't one with low errors; it's one where staff isn't reporting," Harren says.

System changes

In the Just Culture environment, an error or adverse event is evaluated in the context of whether it is a systems error that needs a systemic fix, or an individual error where a person made a mistake, says Debby Rogers, RN, MS, vice president of quality and emergency services at the California Hospital Association. "We know if we give professionals a simple math test of 100 questions, they will miss three just due to human error. So you have to look at whether it was human error or reckless behavior."

Technology can also help minimize human error. For example, Harren's institution invested in smart pump technology in response to errors in programming patient-controlled analgesic pumps. Barcodes are another tool to decrease the risk of giving the wrong medication or wrong dose. At the bedside, a nurse scans a barcode on the patient's armband, then one on the medication to ensure the barcodes match. "We take errors, learn from them, and look to technology for solutions to help," Harren says. "Technology won't eliminate human error, but it can be a great enabler in significantly reducing it." Staff play a key role in identifying helpful technology.
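At its core, the bedside barcode check is a simple matching rule: both scans must agree with the active medication order before administration proceeds. The sketch below illustrates that logic only; the identifiers, field names, and barcode_check function are hypothetical and do not reflect any particular vendor's system.

```python
# Minimal sketch of the bedside barcode double-check described above.
# All identifiers and the barcode_check helper are hypothetical; a real
# system would integrate with pharmacy and electronic health record data.

from dataclasses import dataclass

@dataclass
class MedicationOrder:
    patient_id: str       # patient the order was written for
    medication_code: str  # drug identifier on the package barcode
    dose: str

def barcode_check(armband_scan: str, medication_scan: str,
                  order: MedicationOrder) -> bool:
    """Return True only if both scans match the active order."""
    if armband_scan != order.patient_id:
        print("ALERT: patient armband does not match the order.")
        return False
    if medication_scan != order.medication_code:
        print("ALERT: scanned medication does not match the order.")
        return False
    return True

# Example: a wrong-medication scan is flagged before administration.
order = MedicationOrder(patient_id="PT-1042",
                        medication_code="NDC-0409-1234", dose="2 mg IV")
barcode_check("PT-1042", "NDC-0409-9999", order)  # prints an alert, returns False
```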

Many medication error reports have resulted in re-packaging or removing a medication, perfect examples of safety process changes made as a direct result of errors, Harren adds. "After all the publicity about heparin errors, I imagine there isn't a single hospital that isn't looking at every error as an opportunity to build in safety valves to prevent recurrence."

The Keystone: ICU project demonstrated that an electronic reporting system made staff feel more comfortable and that it usually resulted in an increase in reporting, Eden says. "We implemented electronic reporting in our ICU, and the number of reports went up." Collaboration within and between systems allows learning from each other's mistakes, rather than repeating them.

Accountability

Advocates stress that a Just Culture is not a blame-free one. There are errors that involve a blatant disregard of standards, where it is an organization?s responsibility to hold staff accountable, Harren says. But it is important to get away from the tendency to react based on the outcome. When a patient dies, firing someone won?t make things safer for future patients.

"The No. 1 shift in our hospital is that our job is not to look for blame as a first reaction, but rather at what system failed," Harren says. "It might turn out there was a blatant disregard, but that has not been the case here."

Even if many errors are the result of process problems, people have to report the error in order for it to be fixed, which goes back to making it safe for them to do so, Eden says. "Using evidence-based practice and more solid research has helped us convince doctors and nurses why we want things reported," she adds. An example is a program implemented to reduce central line infections by treating line insertion as a sterile procedure. "That means the doctor and anyone else needed to be in full sterile garb and the patient needed a body drape," Eden says. "But this is the ICU, not the OR. This was a huge shift. Staff needed to feel safe in speaking up and saying, 'Doc, you didn't put your cap on yet.' Nurses had to be assertive enough to say, 'We're not proceeding until we go through the checklist.' We gave them the checklist on a bright yellow sheet of paper, and a procedure cart that had everything on it. Now, three-plus years later, we have evidence that the infections went down. Now no one even blinks when they are reminded. It has become the culture. Staff expect that people are going to speak up and self-identify things that might put patients at risk."

The Keystone materials stress that senior leadership support is critical. Saint Agnes has administrative champions, and the chief medical and chief nursing officers each do rounds once a month at change of shift. Eden highlights specific safety initiatives and the staff members behind them. Incidents are used as teaching tools. "When you hear about something, you kind of say, 'That would never happen to us,'" Eden says. "But when it's your story and your patient, you're involved." Every staff meeting has a safety item on the agenda.

"That keeps it in front of people that yes, this is an extra task, but look at the results. Six months without ventilator-associated pneumonia, for example. We find little wins and make sure the team is aware of them. That helps you remember why you did it in the first place: to create the best level of healing we could for every patient we touch."