The Institute of Medicine (IOM) published To Err is Human 24 years ago with a call for the healthcare community to focus on patient safety. Despite widespread attention to the report, improvements in quality have been mixed. Because today is World Patient Safety Day, it is important to review the past and propose some solutions going forward.
Many of the quality measures included in the CMS value-based purchasing programs, such as hospital-acquired infections (HAIs), demonstrated improvement up until the pandemic. The 2018 National Impact Assessment Report evaluated CMS quality measures from 2006-2015 and found that national performance trends were improving for 60% of the measures analyzed and stable for about 31%.
On the other hand, David W. Bates, MD, and colleagues published a retrospective analysis of data from calendar year 2018 and found that an adverse event occurred in nearly one in four admissions, and that close to one-fourth of these events were preventable. The rates of error were similar to the findings in the original IOM report, although the types of patients and procedures were likely more complex in the more recent period.
During the COVID-19 public health emergency (PHE), one of us (Fleisher) co-authored an editorial with colleagues from CMS and CDC showing that quality measures related to safety in general and HAIs specifically had acutely declined. The 2024 National Impact Assessment of CMS Quality Measures Report demonstrated similar findings of worsening quality measures in 2020 and 2021.
With the ending of the PHE, the Leapfrog Group reported variable but continuous improvement in HAIs over the past several years, although none of the measures returned to baseline. A new report from the American Hospital Association (AHA) and Vizient similarly suggests that hospital performance overall is now better than pre-pandemic, and acknowledges that while progress is good, the work must continue. This has led multiple groups, including the President's Council of Advisors on Science and Technology (PCAST), to call for a re-dedication to the basics of patient safety.
While all healthcare providers are dedicated to delivering the safest care, how can this be accomplished given the documented shortages in the healthcare workforce and widespread reports of clinician burnout?
The Old Solutions Aren’t Working
As a clinician and a patient safety researcher, respectively, we believe that the old solutions to improve patient safety in the hospital setting will not succeed in achieving the desired goal. We need to develop new strategies to achieve patient safety goals within the context of the current reality of care delivery — overworked clinical staff who are still dealing with the effects of COVID, electronic health records, increased workplace violence, and overall work burden, along with the potential benefits of, and concerns regarding harm from, embedding artificial intelligence (AI) into the workflow.
Despite the clear motivation to promote accountability, we are concerned about the unintended consequences of well-intentioned efforts. Notably, advocating for holding hospitals accountable can translate into increased pressure on the workforce and reduced willingness to report problems, further undermining patient safety. A crucial element in achieving optimal care while minimizing healthcare-related harms is ensuring that clinical staff feel safe to speak up with questions, concerns, and mistakes, in order to interrupt the link between error and patient harm. A climate of psychological safety, where staff understand the implications of failure and appreciate the need to speak up to continuously learn and improve, is thus foundational to making this happen.
We need to rethink how to improve safety in the post-pandemic world, which includes employing the tenets of implementation science and learning how to adapt prior learnings from safety science to today’s reality. This will require a willingness to “fail intelligently” in order to innovate and learn. In this context, psychological safety is particularly important. Being willing to try something new, in a safe and controlled setting, is critical in the context of the high-stakes field of medicine.
Several new programs have been developed to disseminate the learnings from these "controlled trials," including the work of Mary Dixon-Woods at THIS Institute and the AHA's Patient Safety Initiative. It will not be as easy as implementing a strategy that worked in the literature or simply adopting strategies that have worked in other industries.
Priority Areas for Cultivating Patient Safety
We see three high priority areas of patient safety that warrant incorporating psychological safety and failing intelligently into their implementation.
The first is teamwork, which has long been a hallmark of patient safety. Prior to the PHE, research on patient safety focused on developing teams and using tools, including checklists, to focus the team on specific tasks previously shown to improve safety. Given the current reality of less stable clinical care teams, a loss of senior clinicians, and a change in the way healthcare workers think about their jobs (exemplified by the rise of unionization among healthcare workers), we need both a more systematic analysis and greater experimentation to solve the current problems. We must be willing to ask questions and not just trust that interventions that worked in one setting will work everywhere.
A second priority area is the use of technology, as advocated by PCAST. Among the many recommendations, the PCAST report suggested that AI could improve patient safety. However, previous technological implementations have shown both positive and negative effects on safety.
For example, the evidence on the effects of computerized provider order entry (CPOE) to reduce medication errors is mixed — it can reduce some errors but may cause others. The WHO Surgical Checklist has also demonstrated variable outcomes depending upon the setting. As we deploy new technologies that have the potential to improve safety, we need to pilot that deployment and continue to monitor and refine the tools and their use over time at the local institutional level.
Third, many groups have advocated deploying the principles of high reliability organizations (HROs). Drawing on tenets from the military, The Joint Commission has championed the HRO approach. Yet Leahora Rotteau, PhD, and colleagues, with an accompanying editorial by Christopher Myers, PhD, and Kathleen Sutcliffe, PhD, laid out concerns about the transferability of the high reliability framework to healthcare.
Within this context, it is the role of leadership in the middle — the clinicians who are leaders on the wards of our hospitals — to define and advance this new reality for healthcare safety. They must ensure that we explore how our “old” tools are currently working and deploy “new” tools, while being willing to learn from the inevitable failures that accompany novelty and experimentation to define the best strategies. Learning from failure is dependent on psychological safety and will be critical for achieving our ultimate goal — enabling the healthcare system to reduce patient harm and improve patient outcomes.
We need to stop assuming that our past safety strategies will keep working, or that new technologies will automatically make us safer. We need to start experimenting again with new ways to enhance patient safety. And we need to do this while being willing to use our failures to learn and create a new, safer system.
Lee A. Fleisher, MD, ML, is professor emeritus of Anesthesiology and Critical Care at the University of Pennsylvania Perelman School of Medicine, and a visiting fellow at the Duke-Margolis Institute for Health Policy. He is the former chief medical officer and director of the Center for Clinical Standards and Quality at CMS. Amy C. Edmondson, PhD, is the Novartis Professor of Leadership and Management at Harvard Business School.
Source link : https://www.medpagetoday.com/opinion/second-opinions/111992
Publish date : 2024-09-17 17:31:12