However, in view of the possibility of human error or changes in medical sciences, neither the author nor the publisher nor any other party involved in the preparation or publication of this work warrants that the information contained herein is accurate or complete in every respect, and they disclaim all responsibility for any errors or omissions or for the results obtained from the use of the information contained in this work.

University of California, San Francisco; Chief of the Medical Service; Chair of the Patient Safety Committee.

For more information, please contact George Hoare, Special Sales, at [email protected].
Except as permitted under the United States Copyright Act of 1976 and the right to store and retrieve one copy of the work, you may not decompile, disassemble, reverse engineer, reproduce, modify, create derivative works based upon, transmit, distribute, sell, publish, or sublicense the work or any part of it without McGraw-Hill's prior consent. McGraw-Hill and its licensors do not warrant or guarantee that the functions contained in the work will meet your requirements or that its operation will be uninterrupted or error-free. Neither McGraw-Hill nor its licensors shall be liable to you or anyone else for any inaccuracy, error, or omission, regardless of cause, in the work or for any damages resulting therefrom.
McGraw-Hill has no responsibility for the content of any information accessed through the work. Under no circumstances shall McGraw-Hill and/or its licensors be liable for any indirect, incidental, special, punitive, consequential, or similar damages that result from the use of or inability to use the work, even if any of them has been advised of the possibility of such damages.
Reporting Systems, Incident Investigations, and Other Methods
Strategies for connecting senior management with frontline staff
Strategies for generating frontline activity to improve safety
Dealing with major errors and sentinel events
Qualifications and training of the patient safety officer
The role of the patient safety committee
APPENDICES
In the introduction, I will describe the epidemiology of error, distinguish safety from quality, and discuss key mental models that shape our contemporary understanding of the field of patient safety. Finally, although most preventable adverse events involve errors, not all do (see “Error Measurement Challenges and Safety” below). At XYZ Hospital, the patient safety officer became concerned about the frequency of medication errors.
The modern patient safety movement began in late 1999 with the publication of the Institute of Medicine's report on medical errors, To Err Is Human: Building a Safer Health System. The report popularized a new approach, known as systems thinking, exemplified by James Reason's Swiss cheese model of organizational accidents. The analysis draws on "The Wrong Patient" case in Chapter 15.
Although we now understand that the root causes of the hundreds of thousands of errors each year lie at the "blunt end," the proximate cause is often an act committed (or omitted, or performed incorrectly) by a caregiver. Quality of care is defined by the Institute of Medicine (IOM) as "the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge." In its influential 2001 report, Crossing the Quality Chasm, the IOM advanced six goals for a quality healthcare system (Table 3-1): patient safety, patient-centeredness, effectiveness, efficiency, timeliness, and equity.1 Note that safety is listed as one of the six components, essentially making it a subset of quality.
For QI practices that require predictable repetition, attempts to "hardwire" the practice or to use alternative providers who focus on the activity are often beneficial. At first glance, "sign your site" (the surgeon marks the operative site in indelible ink after confirming the correct location) seems a particularly robust solution. Process: Active communication among all members of the operative/procedural team, consistently initiated by a designated member of the team, conducted in a "fail-safe" mode; that is, the procedure is not started until any questions or concerns have been resolved.
Once the operation was complete, the nurse simply gathered the sponges by picking up the rings, the way a line of fishhooks is pulled out of the water by winding in the line. For example, an experienced physician, presented with the case of a 57-year-old man with 3 days of chest pain, shortness of breath, and dizziness, responds by thinking, "The worst things this could be are a heart attack or a blood clot in the lungs." Could the conditions in the room (layout, lighting, ambient noise, or the nurse's workload) have affected the outcome?
The radiologist's report read: "The nodule is markedly enlarged compared with the one on the October X-ray." That was the first that Silber's GP had heard of the earlier X-ray, and by then it was too late. The fog was thick, and in 1977 there was no ground radar to signal to the flight crew whether the runway was clear; the crew had to rely on their own eyes or those of the air traffic controllers. Some in healthcare point to our fluid teams (the fact that a surgeon is likely to work with a different set of nurses, technicians, and perfusionists each day) as an additional obstacle to improving teamwork.
Known in the military and in aviation as a debriefing, this involves all team members taking a moment at the end of an activity to discuss, explicitly and nonjudgmentally, what went wrong and what went right.5 The lessons from these debriefings are often invaluable, and just as importantly, the meetings reinforce the value of team behavior, the critical importance of speaking up, and the fact that everyone, including the leader, is fallible. Orders must be processed instantly (none of the "system down for scheduled maintenance" or "orders processed the next business day" so familiar from commercial transactions). As a patient moves across silos, from clinic to hospital, from state to state, from hospital to hospice, it is unlikely that the information they need will move along with them.
This screen highlights physiologic changes as they occur (in this case, a rapid heartbeat, with a trend toward rising pulse rate and falling blood pressure [BP]); such monitoring can help clinicians detect and respond to these changes before the patient suffers an adverse event related to their active medical condition. While you may find all of this terribly exciting (and it is), the computerization of healthcare is also fraught with challenges. Incident reports come from frontline personnel (for example, the nurse, pharmacist, or physician caring for a patient when a medication error occurred) rather than, say, from supervisors.
At this writing, for example, more than half of U.S. states have implemented mandatory reporting programs that require certain hospital errors to be reported.4 The state of Pennsylvania has a system that requires hospitals to report all "serious events," "incidents," and hospital-acquired infections. The root cause analysis (RCA) technique involves a deliberate, comprehensive dissection of an error, laying out all the relevant facts but diligently searching for underlying ("root") causes rather than settling for superficial explanations (such as "the doctor grabbed the wrong chart" or "the pharmacist stocked the wrong medication"). In their masterful discussion of this case9 in our Quality Grand Rounds series on medical errors (Appendix I), Chassin and Becher coined the term "culture of low expectations." Reflecting on the actions, or inactions, of the nurse and the resident, they wrote:
One aspect of a safe culture is combating this "culture of low expectations," in which workers come to assume that communications will be flawed and therefore fail to double-check, even in the face of clear warning signs.