Exam (elaborations)

NUR MISC/Patient Safety: Understanding Adverse Events and Patient Safety, 100% CORRECT

Pages
108
Grade
A+
Uploaded on
09-11-2022
Written in
2022/2023

NUR MISC/Patient Safety: Understanding Adverse Events and Patient Safety

"First, do no harm." This phrase is one of the most familiar tenets of the health care profession. If you poll a group of health care professionals, it is likely all would say they strive to embrace this motto in their practice. And yet, patients are inadvertently harmed every day in the health care system, sometimes with severe consequences.

Noah Lord was one of these patients. At the age of four, Noah had an operation to remove his tonsils and adenoids to help with chronic ear infections. Although it was a simple outpatient procedure, a series of miscommunications increased his risk for harm:
• Noah's surgeon did not fully understand Noah's symptoms or history.
• After the procedure, the care team sent Noah home early, possibly without notifying the surgeon.
• When Noah's mother called the hospital five times for help, each time she spoke with people who failed to communicate important information, such as critical warning signs.

Noah was at home when he began to bleed profusely from his nose and mouth, where he'd had the surgery. Tanya Lord, his mother, was a lifeguard and knew CPR. She was able to clear his airway and revive him three times, but Noah eventually died before paramedics arrived to help. In the video, she describes how what happened changed her life forever.

Adverse Events Are Common

Although the results are not always as devastating as what happened to Noah, the reality is that adverse events in health care happen all the time. Studies of different health care settings in the United States have found:1
• About 1 in 10 patients experiences an adverse event during hospitalization.
• Roughly 1 in 2 surgeries has a medication error and/or an adverse drug event.
• More than 700,000 outpatients are treated in the emergency department every year for an adverse event caused by a medication.
• More than 12 million patients each year experience a diagnostic error in outpatient care.
• About one-third of Medicare beneficiaries in skilled nursing facilities experience an adverse event.

The consequences of these adverse events can be physical, emotional, and/or financial.

People Make Mistakes

The simplest definition of patient safety is the prevention of errors and adverse effects to patients associated with health care.1 Patients are inadvertently harmed every day in health care systems, sometimes with severe consequences. And some — but not all — harm to patients is the result of human error. No matter how well-intentioned, well-trained, and hard-working, health professionals are human and make mistakes. Even Dr. Don Berwick, the founder of IHI, will tell you about errors he has made.

As Dr. Berwick said, "Almost anyone in that situation stood a substantial risk of making that same error." It's true. In fact, as the patient safety field has evolved, it has moved away from the term "medical error," which tends to overemphasize the role of individuals in causing harm. As it turns out, exploring the real root causes of harm means looking far beyond individual providers; it means taking a close look at the systems in which they work.

The Evolution of Patient Safety

The patient safety movement started as a recognition that health care was causing injury and death to patients in the course of care. Today, the movement encompasses much more: designing health care systems that reliably deliver the best outcomes to patients at the lowest cost.

In 1999, the Institute of Medicine (IOM) released its landmark report, To Err Is Human, which revealed that between 44,000 and 98,000 people died each year in United States hospitals due to medical errors and adverse events.2 It did not identify the main cause of the problem to be reckless or incompetent providers. "Faulty systems, processes, and conditions that lead people to make mistakes or fail to prevent them" were usually to blame for patient harm, according to the report.
Within days of the report, new legislation tasked the US Agency for Healthcare Research and Quality with studying health care quality. Over the next five years, the rate of English-language articles published on patient safety almost tripled.3

Dr. Bob Wachter is a renowned patient safety expert and author. Listen to him describe how patient safety suddenly became a priority, and how the field has evolved since 1999.

Health Care Is a Complex System

Most health care professionals — physicians, nurses, pharmacists, and so forth — are drawn to health care out of a desire to help others. They go through intensive training and are carefully screened for their positions. Some take further instruction, such as this online course, to enhance their education and better prepare for patient care duties. Given the conscientious nature of the typical health care provider and the comprehensive training he or she receives, why is it that so many adverse events occur? There are many answers to that question:
• Technology is evolving rapidly. The practice of modern medicine involves numerous drugs and highly technical equipment. There are more than 10,000 types of medical devices available today.1
• There is not always a clear right answer. The science of medicine is filled with nuance, and what one health care provider or organization feels is good practice, another may not. A 2002 AHRQ review identified more than 120 different systems to rate the strength of scientific evidence.2
• There is rarely enough time. Providers are often caring for a great number of patients, all of whom are unique. A 2012 study of over 13,000 US physicians found more than 40 percent saw more than 20 patients each day.3
• Patients require complex, coordinated care. Multiple caregivers and patient handovers leave room for miscommunication at every turn. One teaching hospital reported 4,000 handoffs daily, for a total of 1.6 million per year.4
• The hierarchical nature of health care can breed disrespectful and abusive behavior. Multiple US studies find more than 90 percent of nurses experience verbal abuse at some point in their careers. Trainees are vulnerable to disrespect and mistreatment as well.5

Steve Spear, DBA, MS, MS, is a Senior Lecturer at the Sloan School of Management and at the Engineering Systems Division at MIT. He has worked with many organizations to help them integrate new technology, and has seen how it creates new risks along with the rewards.

Health care today relies on a coordinated effort of people working across many different functions, disciplines, and specialties. All these individual components of the system need to function effectively not only on their own, but also together. Organizational theorists such as James Reason (whom you will learn about in later courses) have described safety as a "dynamic non-event": Safety is dynamic because it requires "timely human adjustments" and a non-event because "successful outcomes rarely call attention to themselves."7 In other words, to make "nothing bad happen" requires a lot of good things to be done right.

Blame & Punishment Are Not Solutions

It is human nature to look for someone or something to blame when things don't go according to plan. Historically, the medical profession has viewed medical errors and adverse events as either an inevitable byproduct of complex care or the result of provider incompetence, often seeking to blame the providers involved in the error. Yet most of the time these situations occurred unintentionally. You already heard how Dr. Berwick felt after he made a mistake. The next story comes from Dr. Lucian Leape, the founding chairman of the IHI/NPSF Lucian Leape Institute, whom many consider the father of the modern patient safety movement.
He shares an experience he had as a pediatric surgeon.

The reality is that patients and family members are not the only people harmed by medical error. Providers suffer, too. Blaming people who are doing their best does not solve the problems that lead to error, and it makes health care less safe. As Dr. Leape has often said: Delivering the right care — for every patient, every time — requires a different way of thinking about error in medicine and a new approach to preventing harm.

Janet is especially busy because one of her colleagues called out sick, and she needs to collect blood samples for four patients. She collects one sample, and before she gets a chance to label it, another patient in an emergency situation needs her help. She leaves the unlabeled specimen on the nurses' station for a moment. When Janet returns, there is a second unlabeled vial of blood at the nurses' station. She realizes another nurse was obtaining blood samples and was also called away. Neither nurse knows which vial belongs to which patient.

What about this scenario seems to make an adverse event likely to occur? (Choose all that apply.)
(A) Janet is juggling multiple tasks at once.
(B) There are not enough staff members to keep up with the demands of care.
(C) Janet was interrupted in the middle of a task with inherent room for error.
(D) Janet is not trying hard enough at her job.

More Info: There is no reason to believe Janet is not trying her best to provide safe, efficient care. Otherwise, all of the other factors make an error more likely to occur in this scenario.

Making Systems Safer

It took a terrible disaster for the aviation industry to learn that blame and punishment don't improve safety. In 1977, a pilot named Jacob Van Zanten made an error that led to the collision of two jumbo jets on a runway in Tenerife, killing more than 500 people in aviation's deadliest accident.
After the crash, industry leaders couldn't blame the accident on mechanical failure or staff misbehavior, as they usually did, because Van Zanten was a revered pilot. In fact, magazines with his image were onboard at the time.

Audio transcript: This is the worst air traffic collision of all time, the crash of two 747s on the runway at Tenerife, Canary Islands, more than 20 years ago. And the story, briefly, was a KLM 747 was getting ready to take off. It was a foggy morning. There was a Pan Am 747 at the end of the runway, and before the fog set in so deeply, the KLM cockpit crew saw the Pan Am, but it seemed to be rolling toward a side spur of the runway, and it was logical to believe it was out of the way. But then the fog came in, and they could no longer see the end of the runway. Air traffic control sent a message to the KLM cockpit, and it got garbled; they didn't hear it correctly. And so (this is from the later report), on hearing this message, the KLM flight engineer asked, "Is he not clear then?" meaning, is the Pan Am 747 not out of the way? The KLM captain, a revered captain who was actually in charge of training all the 747 pilots in KLM's fleet, didn't understand him, and the engineer repeated the question: "Is he not clear, that Pan American?" The captain replied with an emphatic yes. And perhaps influenced by his great prestige, making it difficult to imagine an error of this magnitude on the part of such an expert pilot, both the pilot and engineer made no further objections. So, in other words, to the best of our ability to reconstruct that scene, two of the three people in the cockpit were not sure that a 747 was not in the way. And yet they green-lighted the plane; they did not object to the captain's proclamation that the runway was clear, because this captain had such great prestige that the authority gradient was simply too large.
Of course, you know what happened next: the plane began lumbering down the runway and emerged from the fog to the horrifying sight of the Pan Am plane right in front of them. They actually managed to get their nose over the fuselage of the Pan Am, but to do so, their tail dragged across the ground of the runway. The tail got up in the air about 25-30 feet, just high enough to slam into the upper deck of the Pan Am. Both planes exploded; 583 people died. And so aviation learned the tragic cost of this kind of hierarchy, where it could be possible that someone could suspect something was wrong and not speak up to power. And they have worked doggedly to decrease that hierarchy, something we have only just begun to do in health care.

Instead of blaming individuals, the aviation industry looked at the system as a whole and learned how flawed processes and problems with the culture set those people up for failure. Since this shift of focus from blame and punishment to system-level improvement, the number of airline accidents resulting in passenger harm has significantly decreased. The industry has continually improved safety by learning from small mistakes, even those that do not cause harm. Health care can and should do the same. In the next video, Dr. Berwick reflects on what he should have done after the mistake he described earlier.

A patient is scheduled for surgery on her left leg. Initially, an intern prepares the patient by marking the correct surgical site on the dressing on the leg. The intern's teammate removes the dressing, then addresses the problem by marking the surgical site on the patient's skin. However, he uses a water-soluble marker, and the ink becomes smeared and illegible. The attending surgeon is new and not familiar with the hospital's marking procedures. Meanwhile, the nurses helping with the surgery are busy preparing the patient and the operating room. Generally, the operating room schedule is tight, and everyone is in a hurry to move the surgery forward.
What type of adverse event in particular is more likely to occur because of the system failures in this scenario?
(A) Improper anesthesia dosing
(B) Retained foreign body after surgery
(C) Wrong-site surgery
(D) None of the above

More Info: Hopefully, the surgeon and nurses will stop to perform site verification before proceeding with the surgery, and move forward with a successful operation. However, regardless of what happens next, a wrong-site surgery is more likely to occur because of a breakdown in the system. Near misses and even small errors are "accidents waiting to happen," and they represent important opportunities to improve safety.

A Framework for Patient Safety

IHI's Framework for Safe, Reliable, and Effective Care is meant to be used in service of realizing the best outcomes for patients and families.

After he made a mistake, Dr. Berwick wished he and his organization could have used what happened to him to make the system safer. "Nowadays, I hope that health care is maturing into a better kind of system for a worker like me — frail, error prone, human," he said. The good news is that over the past 20 years, health care organizations have begun to realize and accept that most errors cannot be linked to the performance of the individual, and are instead the result of a series of system failures. Even better news is that these system failures are often preventable. IHI's patient safety experts have created a framework to help organizations understand the components of safe systems of care, which focuses on two broad areas:1
• Organizational culture, which is the product of individual and group values, attitudes, competencies, and behaviors related to safety
• Learning systems, which measure performance and help teams make improvements

In the remainder of this course and the Open School Patient Safety Curriculum, we will discuss each element of the framework in detail. We hope you will continue learning with us.
Because, as Dr. Leape reminds us, we have made great progress in safety, but there is still plenty of opportunity to improve.

Post-Assessment Quiz

According to WHO, in developed countries worldwide, what is the approximate likelihood that a hospitalized patient will be harmed while receiving care?
Your Answer: 10%
According to WHO, in developed countries up to 10 percent of hospital patients may be harmed while receiving care.

Since the publication of To Err Is Human in 1999, the health care industry overall has seen which of the following improvements?
Your Answer: Wider awareness that preventable errors are a problem
More than a decade after the publication of To Err Is Human, there is now wide recognition throughout health care that the number of errors is far too high. Although this awareness has not yet led to consistently lower rates of preventable medical error, progress is being made. Health care organizations have begun to realize and accept that most errors cannot be linked to the performance of individuals, but rather to the systems in which they function.

Safety has been called a "dynamic non-event" because when humans are in a potentially hazardous environment:
Your Answer: It takes significant work to ensure nothing bad happens
The best answer is that it takes significant work to ensure nothing bad happens. When things go right in a potentially hazardous environment, nothing bad happens. But in order for this "non-event" of nothing going wrong to occur, a lot of things must be done right. Thus, safety has been described as a "dynamic non-event."

To prevent this type of error from recurring in this unit, which of the following is MOST important?
Your Answer: An improved culture of safety and teamwork
Had there been a culture of safety fostering better teamwork, this error may well have been prevented. In this case, when James asked Maria for help, she made him feel bad instead of being a team player.
In this type of environment, James may be reluctant to ask for help, even if he is more closely supervised. We can generally assume that health care providers do not want to harm their patients, so the threat of punishment is not the best way to prevent mistakes. Although errors may occur when there is no recognized best practice, in the case of IV fluid replacement, clear recommendations do exist.

Who is likely to be negatively affected by this medical error?
Your Answer: All of the above
The best answer is all of the above. Patients and families are not the only ones affected when a medical error occurs. In this case, James is likely to be devastated, and Maria may be affected as well. Some providers even leave their profession after committing errors leading to a death.

Your Role in a Culture of Safety

What Does a Culture of Safety Look Like?

In the previous lesson, you saw several examples of people who could have spoken up to prevent accidental harm. A culture of safety would have made it easier for caregivers to voice their concerns, and would have made it more likely that others would respond. In a culture of safety, providers discuss errors and harm openly, without fear of being unfairly punished, and with confidence that reporting safety issues will lead to improvement.

Dr. David Bates, MD, MSc, is a Professor of Medicine at Harvard Medical School and a leading researcher in the field of patient safety. Here is what he has learned about building a culture of safety.

In Lesson 1, we shared IHI's framework, which identifies the following factors that contribute to an organization's safety culture:1
• Psychological safety: creating an environment where people feel comfortable raising concerns and asking questions and have opportunities to do so
• Accountability: holding individuals responsible for acting in a safe and respectful manner when they are given the training and support to do so
• Negotiation: gaining genuine agreement on matters of importance to team members, patients, and families
• Teamwork and communication: promoting teams that develop shared understanding, anticipate needs and problems, and apply standard tools for managing communication and conflict

In this lesson, we'll review each of these dimensions of a culture of safety. Two interrelated domains underpin the Framework for Safe, Reliable, and Effective Care: the culture (orange elements) and the learning system (gray elements). In this context, culture is the product of individual and group values, attitudes, competencies, and behaviors that form a strong foundation on which to build a learning system.

Every Person Contributes to Culture

Unfortunately, many professional and social groups — even groups of family and friends — promote the opposite of psychological safety: They don't support the value of asking questions, seeking feedback, or suggesting innovations. Before he became Director of the Clinical Effectiveness and Evaluation Unit at the Royal College of Physicians of London, Dr. Kevin Stewart was new to his career and the hierarchy of medicine. He tells the story of how he accidentally hurt a patient while trying to avoid a confrontation with his supervisor.

Every person in a system contributes to its culture. What you do influences the behavior of others, whether you're a supervisor or the newest staff member. No matter who you are, how you behave toward others will make a difference.

Psychological Safety

Psychological safety is key to reducing the likelihood that a patient will get hurt, as you just saw. But it has other benefits, too, such as innovation and faster adoption of new ideas. Amy Edmondson, a professor at Harvard Business School, is an expert in team performance.
In a study of surgical teams, she and co-researchers found that when team members felt comfortable making suggestions, trying things that might not work, pointing out potential problems, and admitting mistakes, they were more successful in learning a new procedure. By contrast, when people felt uneasy acting this way, the learning process was stifled.1 Edmondson explains more about how psychological safety helps groups learn.

Every time you work with your peers, even when you're not the most senior person in the room, you can model behaviors that promote psychological safety:
• Make yourself approachable.
• Seek to engage all team members.
• Encourage feedback.
• Respond to suggestions.
• Respect and value every team member and his or her input.

A brain surgeon was about to perform a difficult procedure on a high-risk patient. Several of the surgical team members had just met for the first time. The surgeon walked into the room and announced, "Good morning, team. This is a difficult case, and I'm human like everyone in this room. Please speak up if you see me about to make a mistake or have a suggestion to help." She then went around the room and introduced herself to everyone by her first name.

Is the surgeon showing good leadership?
(A) Yes
(B) No

More Info: By introducing herself, encouraging participation, and valuing everyone's role, the surgeon helps create an environment in which team members can participate to their full potential and speak up if necessary, to help the group navigate problems that could emerge, especially in a high-risk surgery. The focus on safety is a powerful reminder of the team's common goal: providing safe, effective care for the patient.2

Accountability

A just culture, initially defined for health care by the lawyer and engineer David Marx, recognizes that competent professionals make mistakes. However, it has zero tolerance for reckless behavior.
This distinction has two benefits:
• People know that certain kinds of behavior are not acceptable.
• People know that they won't be punished for admitting to errors that happen when they're trying to do the right thing.

Patient safety expert Fran Griffin, RRT, MPA, explains the defining characteristics of a just culture.

As Fran Griffin said, unsafe actions based on reckless decision making are not acceptable. However, culpability can be a difficult line to draw. David Marx recommends distinguishing between three types of human behavior, defined as follows:1
• Human error: inadvertently doing something other than what you should have done. For example, you suddenly notice while driving that you've exceeded the speed limit, without intending to do so.
• At-risk behavior: making an intentional behavioral choice that increases risk — without perceiving that heightened risk or believing the risk is justified. An example is when consciously driving 72 miles per hour feels safe to you, even though the posted speed limit is 65 miles per hour.
• Reckless behavior: consciously disregarding a visible, significant risk. Reckless behavior, such as driving drunk, involves choosing to put oneself and others in harm's way.

You promote a culture of safety when you speak up about unsafe acts while recognizing that even competent, well-meaning professionals will make errors.

Take a guess: For which of the above categories do you think Marx recommended disciplinary action?
(A) Human error
(B) At-risk behavior
(C) Reckless behavior
(D) None of the above

More Info: In Marx's model, he suggests disciplinary action should be reserved only for cases that fall into the third category, "reckless behavior." For cases classified as "human error," he recommends system-level improvement. For "at-risk behavior," he suggests system-level improvement plus coaching. Algorithms based on Marx's work can help you assess the best approach when something goes wrong and a patient is harmed.
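Marx's model is, at heart, a simple decision rule: each of the three behavior categories maps to a different organizational response. As a rough illustrative sketch only (the function and its names are our own invention, not a real IHI or Marx tool), that mapping could look like this:

```python
# Hypothetical sketch of the just-culture response mapping described above.
# The three categories and their recommended responses come from the text;
# the function name and API are illustrative assumptions.

RESPONSES = {
    "human error": "system-level improvement",
    "at-risk behavior": "system-level improvement plus coaching",
    "reckless behavior": "disciplinary action",
}

def recommended_response(behavior: str) -> str:
    """Return the just-culture response for a classified behavior."""
    key = behavior.strip().lower()
    if key not in RESPONSES:
        raise ValueError(f"Unknown behavior category: {behavior!r}")
    return RESPONSES[key]

# Example: a caregiver who forgot to follow up (human error) gets a
# system fix, not punishment.
print(recommended_response("Human error"))
```

The point of writing it down this way is that the response depends on the classification of the behavior, never on the severity of the outcome.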
For example, IHI recommends asking five questions:2
• Were their actions malicious? Did they intend to cause harm?
• Was their sensorium intact? Were they thinking clearly?
• Were their actions reasonable and appropriate?
• Were their actions risky, reckless, or unintentional?
• Does the individual have a history of unsafe behavior?

Teamwork and Communication

No matter what role you play in health care, you will be a member of a team, and thus you have a responsibility to communicate effectively and value the contributions of other team members. The processes of health care are too complex to be safely carried out by individual experts who try hard. In Lesson 1, Dr. Lucian Leape told the story of a child who died under his care after no one challenged his clinical judgment. In the next video, he explains what the experience taught him about the importance of teammates.

When patient care teams function as a group of individual experts and don't take deliberate steps to ensure safety, they can inadvertently increase the risk of error and patient harm. On the other hand, when health care teams function as a collaborative unit with regular two-way communication, they promote safe, reliable, and effective care. During each team interaction, team members know the plan, and there is a dynamic that supports psychological safety.

Negotiation

Given the level of complexity of health care and the importance of many of the daily decisions teams must make, providers need to be able to negotiate effectively to gain genuine agreement. Whenever possible, collaborative negotiation is best. In collaborative negotiation, both parties work together to find a mutually agreeable solution through appreciative inquiry (asking simple questions to gain greater insight into the other person's needs and interests) and self-reflection (working to understand your own desires and interests). So far, we have looked at examples of building a culture of safety in behavior toward colleagues.
It is just as important to practice the same skills with patients. When it comes to negotiating and collaborating effectively with patients, sometimes it is fear that gets in the way. Patient safety expert Barbara Balik, RN, shares advice for working more collaboratively with patients in spite of the time pressure virtually all providers feel.

Patients and Families

Another reason to practice strong teamwork and communication skills with patients and families: They can play an important role in improving safety. You met Tanya Lord in the previous lesson. She lost her son, Noah, to harm from the health care system. Before Noah died, Tanya knew something was wrong, but no one would listen.

There are a number of potential roles that patients and families might play in patient safety:1–3
• Helping to identify adverse events.
• Helping to inform clinicians about adverse events they are not aware of.
• Behaving as advocates for their own health.

Dr. Saul N. Weingart, Chief Medical Officer at Tufts Medical Center, shares some of his research findings related to the above.

Post-Lesson Assessment

One hospital CEO insists on including performance data in the hospital's annual report. "We do very well on most measures, except for one or two, but we put those in anyway," she says. "We want to hold ourselves accountable."

Does this practice demonstrate effective or ineffective leadership?
(A) Ineffective leadership: Because results are an important indication of leadership, publicly sharing poor results is an unwise practice.
(B) Effective leadership: Being transparent, even about poor results, is a mark of a good leader.
(C) Ineffective leadership: Leaders are people who have followers, and sharing poor results might cause the leader to lose some followers.
(D) Effective leadership: It is good to share results in the annual report, but the leadership would be even more effective if it shared only the strongest results.
Good leaders know that leaders are highly visible — and they therefore set examples for others. A leader who seeks transparency in her followers must demonstrate the same quality herself.

Use the following scenario to answer questions 2 and 3: At the large multi-specialty clinic in which you work, there have been two near misses and one medical error because various clinicians did not follow up on patient results. Different caregivers were involved each time. When asked why they failed to follow up, each caregiver said he or she forgot.

Based on what you know, how would you classify the caregivers' behavior?
(A) Human error
(B) At-risk behavior
(C) Reckless behavior
(D) None of the above

The best answer is human error, as there is no reason to believe the caregivers acted with intentional disregard for safety. The fact that multiple people made the same mistake further suggests the problem was due to a poorly designed system rather than at-risk or reckless behavior by individuals.

A nurse who realized that his colleagues weren't consistently following up on patient results reported the problem to the clinic leadership right away. Which response would be most consistent with a culture of safety?
(A) Transferring the nurse to another clinic
(B) Investigating the problem and seeking systems solutions
(C) Thanking the nurse and asking him to keep quiet about it
(D) Placing the item on the agenda for the leadership meeting next year

The best answer is investigating the problem and seeking systems solutions. An organization must develop a method to surface and learn from defects and harm that occur to patients. We know that incident reports are one way to learn. They can also be an indicator of the culture of the organization. That is, the more people are willing to report, the safer they feel.

Why is psychological safety a crucial component of a culture of safety?
(A) Without it, people won't be interested in improvement work.
(B) It allows people to remove unsafe members of the team quickly.
(C) Without it, patients will not follow their doctors' advice.
(D) It allows people to learn from mistakes and near misses, reducing the chances of further errors.

In psychologically safe environments, people understand that making mistakes is rarely a sign of incompetence, and that they won't be judged for discussing mistakes. Because of that, people are able to call out errors, whether their own or others', and improve the processes that made the errors possible.

A medical unit in a hospital is in the midst of hiring some new physicians. During an orientation for new employees, a senior leader stands up and says, "We expect that the same rules apply to everyone on the unit, regardless of position." Which aspect of a culture of safety does this unit seem to value?
(A) Psychological safety
(B) Accountability
(C) Negotiation
(D) None of these

Holding all employees to the same standards of professional behavior, regardless of position, is an example of accountability.

Your Role in Building Safer, More Reliable Systems

Learning from Adverse Events

If blame and punishment aren't appropriate responses to adverse events, what are? Consider the following story from Sorrel King, a mother whose initial reaction to a catastrophic health care event was blame and anger — but who ultimately chose another path.

Josie's parents created a patient safety program at Johns Hopkins Children's Center, focusing on systems causes of error. "For me, maybe it would have been easier to pin it on that one nurse," King said. "But it wasn't her fault; it was the system's fault." By examining the events that led to an error, health care organizations can take a learning approach to responding to error and unintended events. They can look for reasonable system changes to prevent the same problems from happening again.
Here’s a reminder of the factors that contribute to the organization’s learning system, which we laid out in Lesson 1:1 • Improvement and measurement: strengthening work processes and patient outcomes using improvement science, including measurement over time • Transparency: openly sharing information about the safety and quality of care with staff, partners, patients, and families • Continuous learning: regularly identifying and learning from defects and successes • Reliability: applying best evidence and promoting standard practices with the goal of failure-free operation over time In this lesson, we’ll review each of these dimensions of safe, reliable, and effective care. Two interrelated domains underpin the Framework for Safe, Reliable, and Effective Care: the culture (orange elements) and the learning system (gray elements). The quality of the learning system is defined by the ability to self-reflect and identify strengths and defects, both in real time and in periodic review. How Complex Systems Fail For anyone who works in health care, this lesson will explain your responsibility to recognize problems and drive improvement in your own work, with the larger system in mind. Steve Spear, whom we introduced in Lesson 1, can offer a quick example of why this is so important: Dr. David Bates, whom you heard from in the previous lesson, has studied adverse events in health care extensively. He has found that in every case of error that leads to death or injury, there are ten more errors that have the potential to cause serious harm but for some reason don’t. These errors are weak signals that something is wrong with the system.1 In health care, examples of weak signals could be: • A nurse inadvertently picks up a multi-dose vial of insulin instead of heparin, but notices right before he injects the insulin into the line. • A nurse injects the wrong medication into the patient, but it is an antibiotic that doesn’t cause an adverse reaction. 
When people in a system sidestep weak signals of potential harm (literally, in Steve Spear’s example) and fail to address safety concerns head on, it’s called a workaround. Workarounds are dangerous because they allow a problem in the system to continue to exist — setting people up to eventually experience failure. Who’s to blame for Hannah’s fall? Check all that apply. (A) Tom (B) Daria and Deepa (C) Joe (D) Rachel (E) Hannah (F) All of these people (G) None of these people More Info In one sense, all of these people are to blame. You might say: • “Hannah should have been more careful.” • “Rachel should have seen the situation for what it was and shouldn’t have interrupted her.” • “Deepa and Daria are to blame because they stepped over the cord.” • “Tom was the one who stepped over it in the first place, so the mess that occurred is his fault.” The fact is, any one of them could have recognized the risk and taken steps to remedy the dangerous situation. But the true cause of Hannah’s fall goes beyond any one person’s actions — and ultimately goes to the very top of the organization. In this imaginary organization, leaders failed to set the expectation that people should recognize and speak up about small safety hazards they observed as part of daily work. That failure led to Hannah’s fall. Unfortunately, in high-risk organizations, the consequences can be much worse. Reliability After the Space Shuttle Columbia disintegrated and killed seven crew members, investigators reported: “With each successful landing, it appears that NASA engineers and managers increasingly regarded the [small problems they saw] as inevitable, and as either unlikely to jeopardize safety or simply an acceptable risk.”1 Often, the little problems and workarounds that crop up in the daily routine become so familiar that people start assuming they’re completely normal, a phenomenon called normalizing deviance. 
Normalizing deviance is a problem because it erodes reliability. Reliability is the ability to successfully produce a product to specification repeatedly. In health care, that “product” is safe, efficient, person-centered care. IHI Executive Director Frank Federico, RPh, explains what makes health care processes reliable: When you choose not to follow a standard operating procedure, you make systems less reliable and put patients at risk. What should you do instead? If the protocol is unclear, takes too long to follow, or is not the best solution to the problem, speak up. Speaking up is the first step toward learning and improvement. Improvement and Measurement You’ve seen now what happens when busy people use workarounds and ignore weak signals. Unfortunately, this is what usually happens. One study observed that nurses worked around operational failures 90 percent of the time, ignoring weak signals such as the pharmacy sending incorrect medication doses, broken or missing equipment, and supplies being out of stock.1 In the video, Steve Spear explains the opportunity each person has to identify vulnerability in the system and take action to correct it: On a hectic day at work, imagine you mistakenly hook up a patient’s oxygen supply to compressed room air instead of forced oxygen. You realize and correct the mistake before any harm comes to the patient. Should you report the error? (A) Yes (B) No More Info Yes. When people report errors, whether they have negative consequences or not, organizations can learn from them. Your error was a weak signal of a problem in your system — one that didn’t cause harm in the moment, but could very well harm a patient in the future. Many health care organizations use internal voluntary reporting systems for the purpose of capturing data about errors. 
While these systems can take many forms (e.g., they may be electronic or paper based, anonymous or open), in most cases anyone can complete and submit a voluntary error report at any time. Transparency Transparent organizations track performance and have the courage to display their work openly. We will say it one more time: Health care professionals have a responsibility to speak up about weak signals. And organizations have a responsibility to respond. Typically, error reports are sent to the risk management, patient safety, or quality department for review and follow-up as well as to the manager of the department in which the error occurred, such as to the pharmacy director for medication-related errors. The organizational response should include: • Acknowledging the issue • Thanking the individual for reporting it • Maintaining communication about what is being done to prevent such an issue in the future On the whole, operational transparency exists when people at all levels can see the activities involved in the learning process: leaders, staff, patients and their families — even other organizations and the community at large. In addition to allowing people to learn, transparency among patients and staff builds trust. Dr. Michael Leonard, a physician leader for patient safety at Kaiser Permanente, made an error more than a decade into his career as a cardiac anesthesiologist: He accidentally administered the wrong medication and re-paralyzed a patient whose surgery was complete. Here is what Dr. Leonard did after the error, and what he learned from the experience: Continuous Learning Conditions that allow organizations to continuously see and solve small problems include: • The people doing work must recognize they have a problem. (Meaning, there must be clear standards for what “normal” should be.) • Someone must be responsible for solving that problem. • The people doing work must be able to notify the responsible person in a timely way. 
• The responsible person must show up without unfair blame and with a desire to solve the problem collaboratively. • There must be enough time and resources to solve the problem. • Feedback loops must provide data back into the various reporting systems to share information and generate insights to prompt action and learning. The reality of today’s health care environment is that the systems that support patient care are complex and error prone, and most organizations lack a comprehensive method for making them less so. This is the reason patient advocates like Sorrel King and Tanya Lord continue to tell their stories: Increasingly, more time and focus are being placed on proactive rather than reactive learning, to prevent tragedies like what happened to Noah. The Framework for Safe, Reliable, and Effective Care is designed to guide organizations and providers on their journey of improvement, with patients and families at the center. Other courses in the curriculum will review each interconnected dimension more closely. Additional Resources Take the Next Step in Your QI Training: Move from Theory to Action Every day, people all over the world are making differences on a local level. The Quality Improvement Practicum is a 9-week online course with coaching that helps you lead a small improvement project in your local setting. Whether you want to streamline a process in your system, improve outcomes for your customers, or feel happier in your own daily work, IHI experts will support you on a journey toward meaningful change. Post Lesson Quiz Which of these is a behavior providers should adopt to improve patient safety? (A) Develop ways to work around broken systems. (B) Ignore patients’ individual preferences when they disagree with “best practice.” (C) Follow written safety protocols, even if they slow you down. (D) Obey your superiors without question. Safety protocols are in place for a reason, and you should follow them, even if they slow you down. 
Sometimes there will be a problem with a policy or procedure, in which case you should report it, rather than inventing a “workaround” (a method to circumvent a problem without fixing it). Likewise, you should speak up if you believe any colleague — supervisors included — is threatening patient safety. Part of patient-centered care is respecting patient autonomy, even if it means considering different treatment approaches than what you would normally consider “best practice.” You’re an administrator at a hospital in a fast-growing suburb. Your hospital has hired three new orthopedic surgeons, including a new chief. These new hires are likely to triple the number of knee replacements done in your hospital. Currently, this procedure is done infrequently, and each time it feels a bit chaotic. As you consider the number of individuals with specialized skills required to execute a safe, effective knee replacement (nurses, surgeons, and anesthesiologists, as well as pre-operative, operating room, and post-operative staff), you realize that this process has the properties of a complex system. A few weeks after the new chief of orthopedic surgery comes on board, she has a moment of inspiration and sketches out a new, radically different way for patients to “flow” through the pre-operative, intra-operative, and post-operative phases. She sends you an email saying that she wants you to meet with her Monday morning to begin implementing it. Which of the following should you keep in mind as your hospital redesigns the way it handles knee replacements? (A) Planning by a multidisciplinary team should allow for the development of an excellent, high-functioning system on the first try. (B) Planning a new complex system for health care delivery has little in common with planning an industrial production process. (C) How system components are integrated with one another is as important as how well they function independently. 
(D) To ensure buy-in, the leader of the design process should be as high up in the organizational hierarchy as possible. Any complex design process should begin with excellent component processes and materials. But such components will not, by themselves, result in an excellent overall result. How components (and component processes) are integrated is a key to overall outcomes. This is as true for a medical care process as it is for an industrial design process. Even with a committed multidisciplinary team, it is very rarely, if ever, possible to get everything right on the first try. Finding flaws after initial implementation (and opportunities for further improvement) should be expected and embraced. While commitment to innovation, excellence, and continual improvement should be supported from the very top of an organization, the actual leadership of the design process should be at the level that will serve best to engage those who have the deepest knowledge of the workflows and component activities, and can engage the multidisciplinary design team. Which of the following is typically true of “weak signals”? (A) They usually result in harm to caregivers or patients. (B) They are uncommon. (C) They can combine with other human or environmental factors to result in catastrophe. (D) They should only be called out by specifically designated individuals within a health care organization . Weak signals that could be used to identify system deficiencies are common — and usually ignored. This is understandable since, by themselves, such signals do not result in direct harm. It is only when they combine with other factors that harm (and sometimes catastrophe) results. Examples in and out of health care abound, including NASA’s Columbia Space Shuttle disaster, which, if the response to such signals had been more robust, could have been prevented. 
Since weak signals occur in daily work at all levels of an organization, each individual must see it as part of his or her job to identify and respond to such signals (or to “escalate” the problem up the hierarchy so that it can be fixed). The term “normalized deviance” refers to: (A) Acceptance of events that are initially allowed because no catastrophic harm appears to result. (B) The standard deviation of a variable in a “bell curve” distribution. (C) The increase in disturbing song lyrics in modern music. (D) Innovation based on observing positive outliers in a production process. Paradoxically, the fact that weak signals do not result in harm is what makes them most dangerous. When a weak signal is ignored (perhaps many times) and no harm results, workers integrate it into their conception of what is normal. Statements like “we always do it that way” may indicate underlying complacency. This acceptance of unsafe, ineffective, or inefficient routines is called normalized deviance. You meet with the nurse administrator responsible for improvement when issues in the process of care are identified by those on the wards. She listens carefully to your concern, but in the end says she can only try to help improve nursing issues, and not those that extend to pharmacy or transport. The primary reason your meeting is unlikely to lead to an adequate solution is: (A) No one is identified as responsible for improvement when abnormalities in the process of care are identified. (B) The responsible individual belittled the nurse reporting the problem. (C) The nurse administrator did not have the appropriate span of responsibility to engage the system components needed to solve the problem. (D) Since things have been going along without a serious adverse event for several months, it appears that the current workaround is effective. Steve Spear identifies a number of steps needed to fix problems in a production system. 
They include recognizing abnormalities; having an identified person to call, with the knowledge, attitude, and responsibility necessary to find a solution; and giving workers the time and resources to solve the problem. In the case of health care, this means treating the “system” as well as the “patient.” The challenge here is that even though someone is designated, and that person may have the time to fix how work is done, the nurse administrator may not have the perspective and authority to work across boundaries of specialty, function, and discipline. The Swiss Cheese Model Why Did Nora Get an Infection? Though born prematurely, Nora Boström was a lively toddler with long curly hair when she began having fainting spells at age 3. Doctors in Palo Alto, California, put her on a medicine to help her lungs grow stronger. The medication was delivered by a central line catheter — a tube that went through a vein in her arm and into her heart. Because of the catheter, Nora became infected. She got one central line–associated bloodstream infection (CLABSI), and then three more in a year. Just before her fourth birthday, she died in a hospital in her mother’s arms. No one is sure whether the infections contributed to her death, but the question remains: Why did Nora get these infections? Understanding the relationship between error and harm is the first step to building safer systems. Doug Bonacum, the former Vice President of Quality and Safety at Kaiser Permanente health system, shares why this is more important than ever: Video Transcript: A New Way of Thinking Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente With an industry filled with professionals drawn to helping others, who go through intense medical training, and are carefully screened for their positions, why is health care so dangerous? The first thing we must appreciate is how complex the practice of medicine has become. 
Even with all the medical research that has been done over the past hundred years, there is still not a high degree of agreement on what constitutes the best and safest practice. Diagnosis and treatment are often performed under some degree of uncertainty, and medication monitoring, particularly in the outpatient setting, is quite challenging. For our frontline practitioners, there are always new medications, new technologies, new procedures, and new research findings to assimilate. It can be overwhelming. Patients are becoming increasingly complex and the diversity of the workforce grows at an increasing rate. Providing safe, reliable care has never been more challenging than today. One thing that is clear to all of us working on this issue today is that we can’t solve our patient safety problems by using the same kind of thinking that created them in the first place. To make things right for every patient every day will require a new way of thinking about error in medicine and a new approach to preventing harm. This new way begins with a deeper appreciation for error causation and error prevention. In this course, we’ll take a look at how human error relates to harm, and what these concepts can teach us about how to improve health care. We’ll start with a helpful way to understand harm: the Swiss cheese model. The Swiss Cheese Model In health care, hazards are everywhere — powerful drugs, complicated procedures, and very sick patients are the norm. Serious adverse events are almost always the result of multiple failed opportunities to stop a hazard from causing harm. James Reason, a psychology professor and one of the seminal thinkers in the field of human error, has called this idea the Swiss cheese model of accident causation — meaning, the idea that harm is caused by a series of systemic failures in the presence of hazard.1 In this model, the cheese represents successive layers of defense in your organization’s safety system. 
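The layered-defense idea lends itself to a quick back-of-the-envelope calculation. The short Python sketch below is illustrative only — the catch rates are assumed numbers, not figures from this course, and it simplifies Reason’s model by treating the layers as failing independently:

```python
# Illustrative sketch of the Swiss cheese model. Each layer of defense
# stops some fraction of hazards; harm reaches the patient only when a
# hazard slips through a "hole" in every layer at once.

def p_harm(catch_rates):
    """Probability that a single hazard passes every defense layer,
    under the simplifying assumption that layers fail independently."""
    p = 1.0
    for rate in catch_rates:
        p *= (1.0 - rate)  # chance this particular layer misses the hazard
    return p

# Hypothetical CLABSI defenses: hand hygiene, sterile gown, checklist,
# prompt catheter removal -- each assumed (for illustration) to catch
# 90% of hazards.
layers = [0.9, 0.9, 0.9, 0.9]

print(p_harm(layers))      # four layers: roughly one hazard in ten thousand
print(p_harm(layers[:2]))  # only two layers: roughly one in a hundred
```

The point of the sketch is the multiplication: each additional, genuinely independent layer cuts the chance of harm by another order of magnitude here — which is also why a latent condition that quietly disables one layer (a hole in the cheese) is so dangerous even when no harm is visible day to day.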
For example, to prevent CLABSIs, providers wear sterile gowns, wash their hands, follow checklists, and remove the catheter as soon as possible; in most cases, safety systems such as these prevent hazardous situations from leading to harm. But they don’t always. In the next video, Doug Bonacum will explain the Swiss cheese model, and how the holes in the cheese represent each opportunity for failure within each layer of defense:2 Video Transcript: The Swiss Cheese Model Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente Let’s consider the delivery of medicine to be the inherent hazard in our industry. Ingesting a medication, having surgery, being placed on bypass, undergoing dialysis, or receiving radiation, for example, all come with significant health benefits, but they are not risk-free. In addition to the hazards inherent in medicine, accident analysis has revealed that the human contribution to adverse outcomes predominates. Taking a step back, Reason asserts that the setup for an accident to occur in a system begins with fallible decisions made by top-level leaders. These decisions are then transmitted via line management and ultimately to the point of production — the point of care in our industry — where so-called preconditions, or qualities of human behavior in production, coexist. These include attributes like the level of skill and knowledge of the workforce; the work schedules, technology, equipment, and maintenance programs; and the individual and collective attitudes and motivation of the workforce itself, which creates its culture. 
In summary, Reason conceptualized the trajectory of accident opportunity as one which begins with what he calls latent failures at managerial levels. It proceeds with complex interactions as the impact of the management decisions gets closer and closer to the point of care, and is neither stopped nor mitigated by one or more levels of defense that were designed to reduce the risk of harm associated with unsafe acts in the very first place. This is the Swiss Cheese Model of accident causation. The Swiss cheese analogy here is that all of the holes or defects in the various levels of the system align to turn a managerial deficiency into an adverse outcome, and the challenge for safety professionals is that because this alignment occurs so infrequently, it is difficult to get the organization to attend to the risks: those that are active and present every day, such as not washing one’s hands, and those that are latent or lie dormant in the system, such as the choice of where to place hand hygiene dispensers in the first place. Latent Conditions and Active Failures Have you ever tried to pull a door open when it can only be pushed open? In this image, the poor design of the door is a latent condition that makes human error more likely to occur: Dr. Rollin Fairbanks, an Associate Professor at Georgetown University and Director at the MedStar Health National Center for Human Factors, took this photo at his local bagel shop after watching a series of patrons try to pull the door open despite the sign that clearly says “push.” How many errors do you think could be prevented if the door had a horizontal bar instead of a handle? Video Transcript: Human Factors in Everyday Life Rollin J. 
(Terry) Fairbanks, MD, MS, CPPS; Director, National Center for Human Factors in Healthcare, MedStar Institute for Innovation; MedStar Health Attending Emergency Physician, MedStar Washington Hospital Center; Associate Professor of Emergency Medicine, Georgetown University There are other factors that might lead us to believe, in retrospect, that we should blame the human in the system for this error, such as knowledge that fire code requires the doors in public facilities to push out, and the fact that they came in the door in the first place, which makes us think that they should know how the door operates. So there are many factors that might lead the human in this system to do the right thing when they approach the door. So why did they do the wrong thing so often? Well, they did the wrong thing because the design of the door is inconsistent with known human factors engineering design principles. The design of the door provides very strong cognitive cues that tell the user to pull on the handle; it's a pull handle, after all, and our brains have learned this association over time. The human factors term for this kind of learning is affordance, and this is a skill-based error. As you approach doors, you don't think about the task at hand. Instead, in skill-based mode, you're constantly receiving cues from the environment about what to do. If the design is not savvy from a human factors standpoint, then the cues can lead you to do the wrong thing. In the Swiss cheese model, the holes in the cheese represent both latent conditions and active failures. Latent conditions are defects in the design and organization of processes and systems — things like poor equipment design, inadequate training, or insufficient resources. These errors are often unrecognized, or just become accepted aspects of the work, because their effects are delayed. Latent conditions lead to active failures — easily observed by Dr. Fairbanks or anyone else. 
Active failures are errors whose effects are seen and felt immediately: someone pushing an incorrect button, ignoring a warning light, or grabbing the wrong medication. In health care, the person on the front line — e.g., the doctor, nurse, pharmacist, or technician — might be the proximal “cause” of the active error, but the real root causes of the error have often been present within the system for a long time. The example shows how a series of contributing factors, including both latent conditions and an active failure, could lead to a medication error. Swiss Cheese Model: Tenerife Disaster Monument by Dutch artist Rudi van de Wint erected in memory of the victims of the Tenerife airport disaster (March 27, 1977), the deadliest air crash in history. You may recall this case from PS 101: Introduction to Patient Safety: We discussed an accident that led the aviation industry to rethink blame as a response to error. Now, let’s look at how the accident happened — and how it illustrates the Swiss cheese model of harm. Video Transcript: Learning from the Tenerife Disaster: Latent Conditions John Nance, pilot, aviation analyst for ABC News, and advocate for patient safety Jacob, this particular day, is a very upset guy. The reason he's upset is because he's had to divert to a place he didn't want to go. See, the chief pilot has to fly every now and then, just like the director of the medical staff has to stay current, and he's gotten out and gone in a 747. He's taken this charter down to the Canary Islands, and he's going to turn around pick up another group of passengers, take them back to Amsterdam, then he’ll get back to the office and somebody else will fly the airplane on. Except it's not working that way. 
Murphy has gotten in the works and there's been a bomb threat over Las Palmas, which is the main airport of the Canary Islands, so he's had to divert with several other airplanes to another airfield with a slightly higher altitude, with only one runway — it's kind of short, and it’s fog-bound today. It’s not really fog; it’s clouds blowing across the runway. But it's just the same. He's had trouble getting his fuel. He's had trouble getting out of there, and he's about to run out of the thing we call “crew duty time.” This is the maximum amount of time that an air crew may remain on duty before we've got to put them to bed to get them some rest. I know this is a completely foreign concept to health care. Anyway, Jacob’s an upset guy this day, because he's finally gotten his airplane started. He has 10 minutes to get this huge 747 to the end of this fog-bound runway and get off the ground without running out of crew duty time. And here's the penalty if he runs out. He's got to put everybody to bed at Las Palmas, buy $30,000 worth of hotel rooms and delays, and it's going to be very embarrassing. He wants to go, and as they get the airplane down the field, they have to taxi down the runway about halfway, because there are no taxiways stressed for a 747 in the first part. Then he has to turn around, get on the taxiway, come to the end, turn around, line up with the runway, they can only see about 300 yards into the fog, and as they line up, the first officer who was very senior at KLM but very junior to Jacob Van Zanten, has never flown with Jacob before, and the second officer who's very senior at the airline as a second officer and flight engineer, but very junior to the first officer, who's very junior to the captain … you get the hierarchy. 
As they get in position, the first officer and co-pilot sees the captain’s right hand coming forward on the throttles with these four huge JT9D engines, 50,000 pounds of thrust apiece, and he knows that they don't have a clearance to take off. On top of that, they don't have what's known as an air traffic control clearance to actually go over to Las Palmas, and he turns toward his commander with wide eyes and says, “Sir, we don't have a takeoff clearance.” And Jacob Van Zanten pulls the throttles back, and — in the inimitable fashion that all of us who qualify as airline captains learn — he says, “I knew that. Get the clearance.” The first officer punches the button, talks to the tower, asks for the clearance. The clearance is read to him, he reads the clearance back. We have a little linguistic disconnect here, because the guys in the KLM cockpit speak Dutch, that's their native language, and yet they're communicating by radio in a thing we call “aviation English,” which is kind of a stylized version of English. The fellow in the tower speaks Spanish because this is a Spanish possession. He is trying to communicate in another language, aviation English, and there is even an air crew on the field moving around out there, who, according to my British friends, don't speak English at all — Pan American. I'm told we speak American — we don't speak English. At any rate, there is a linguistic disconnect, so when the first officer is finishing his read-back and notices with increasing horror that the captain's hand is once again coming forward on the throttles, they have now the air traffic control clearance but they still don't have the physical clearance to take off. These are two separate clearances required. By the way, this guy is not a dummy, this is not a dumb individual, this is you or me sitting in that right seat. 
This is everybody in this room who has ever been in a position to see a senior individual doing something for the second time, and you got by with it first. You were well-treated when you pointed it out the first time. Do you really want to tell them again that they're fouling up? He would like to find another way to do it, so the first officer, knowing that the captain is starting the takeoff roll without permission, keeps his finger on the transmit button and says, “And we are at takeoff, KLM 1422.” The problem with this is that it doesn't make sense in aviation English. We are at takeoff, but as we all do as human beings, we fill in the blanks. Don’t we? You expect to hear something, and it's close, so you just go ahead and fill in the blanks. We are at takeoff, we are in take-off position. Yeah, yeah, that’ll work, that's what he means, we're going to take off position. However, there's something wrong

NUR MISC/Patient Safety: Understanding Adverse Events and Patient
Safety
“First, do no harm.” This phrase is one of the most familiar tenets
of the health care profession. If you poll a group of health care
professionals, it is likely all would say they strive to embrace this
motto in their practice. And yet, patients are inadvertently
harmed every day in the health care system, sometimes with
severe consequences.

Noah Lord was one of these patients. At the age of four, Noah had
an operation to remove his tonsils and adenoids, to help with
chronic ear infections. Although it was a simple outpatient
procedure, a series of miscommunications increased his risk for
harm:


• Noah’s surgeon did not fully understand Noah’s symptoms or history.

• After the procedure, the care team sent Noah home
early, possibly without notifying the surgeon.

• When Noah’s mother called the hospital five times for help,
each time she spoke with people who failed to communicate
important information, such as critical warning signs.

Noah was at home when he began to bleed profusely from his
nose and mouth, where he’d had the surgery. Tanya Lord, his
mother, was a lifeguard and knew CPR. She was able to clear his
airway and revive him three times, but Noah eventually died
before paramedics arrived to help. In the video, she describes
how what happened changed her life forever.

Adverse Events Are Common
Although the results are not always as devastating as what
happened to Noah, the reality is adverse events in health care
happen all the time. Studies of different health care settings in
the United States have found:1

• About 1 in 10 patients experiences an adverse event during hospitalization.

• Roughly 1 in 2 surgeries has a medication error and/or an adverse drug event.

• More than 700,000 outpatients are treated in the emergency department every year for an adverse event caused by a medication.

• More than 12 million patients each year experience a diagnostic error in outpatient care.

• About one-third of Medicare beneficiaries in skilled nursing facilities experience an adverse event.

The consequences of these adverse events can be physical,
emotional, and/or financial.


People Make Mistakes
The simplest definition of patient safety is the prevention of
errors and adverse effects to patients associated with health
care.1

Patients are inadvertently harmed every day in health care
systems, sometimes with severe consequences. And some — but
not all — harm to patients is the result of human error. No matter
how well-intentioned, well-trained, and hard-working, health
professionals are human and make mistakes. Even Dr. Don
Berwick, the founder of IHI, will tell you about errors he has made

As Dr. Berwick said, “Almost anyone in that situation stood a
substantial risk of making that same error.” It’s true.

In fact, as the patient safety field has evolved, it has moved
away from the term “medical error,” which tends to
overemphasize the role of individuals in causing harm. As it turns
out, exploring the real root causes of harm means looking far
beyond individual providers; it means taking a close look at the
systems in which they work.
The Evolution of Patient Safety

The patient safety movement started as a recognition that health
care was causing injury and death to patients in the course of
care. Today, the movement encompasses much more: designing
health care systems that reliably deliver the best outcomes to
patients at the lowest cost.

In 1999, the Institute of Medicine (IOM) released its landmark
report, To Err Is Human, which revealed that between 44,000 and
98,000 people died each year in United States hospitals due to
medical errors and adverse events.2

It did not identify the main cause of the problem to be reckless or
incompetent providers.
