In this session, we will review how changing our auditing tool and methodology used for routine peer review helped develop a safe learning environment, increase practitioner engagement, and proactively identify system errors and unsafe conditions in an outpatient telehealth setting.
Tool: Practitioner Audit Framework
Problem: We observed that peers would score domains as “Met” despite having feedback for improvement. We hypothesized that practitioners were hesitant to mark standards as “Not met” because of the percentage scoring algorithm, in which any “Not met” response lowered the overall score. As a result, we could not easily observe trends in the domains with the greatest opportunity for improvement.
Tool Selection: After conducting industry research, we decided to adopt the CodeS model, originally implemented at Riverside University Health System-Medical Center, (1) to move away from traditional percentage scoring with only Met/Not met responses and (2) to gather more feedback and data on potential system errors. Our goal was to emphasize platform and care improvements over punitive measures in order to improve patient safety, reduce errors, and promote improvement.
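The contrast between the two approaches can be sketched in code. The snippet below is a hypothetical illustration, not the actual audit tool: the domain names, response categories, and function names are assumptions for the example. It shows why a single percentage score hides information (any “Not met” response lowers the overall number, discouraging honest scoring), while a categorical tally surfaces which domains accumulate improvement feedback.

```python
from collections import Counter

def percentage_score(responses):
    """Traditional scoring: percentage of audit elements marked 'Met'."""
    met = sum(1 for r in responses if r == "Met")
    return 100 * met / len(responses)

def domain_trends(audits):
    """Categorical tallying: count improvement findings per domain across
    audits, so trends surface without penalizing an overall score."""
    counts = Counter()
    for audit in audits:
        for domain, finding in audit.items():
            if finding != "Met":
                counts[domain] += 1
    return counts.most_common()

# Hypothetical audits: feedback recorded per domain instead of one score.
audits = [
    {"documentation": "Met", "follow-up": "Opportunity", "consent": "Met"},
    {"documentation": "Opportunity", "follow-up": "Opportunity", "consent": "Met"},
]

print(percentage_score(["Met", "Met", "Not met"]))  # one "Not met" drags the score down
print(domain_trends(audits))  # "follow-up" emerges as the trending domain
```

Under the percentage model, the only way to keep a score high is to mark everything “Met”; under the categorical model, recording an opportunity is informative rather than punitive, which is the behavioral change described above.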
Usage: Each practitioner on the platform is evaluated monthly and given direct feedback on opportunities for improvement. If the committee identifies common trends or opportunities within an audit element, findings are communicated to practice leadership, and additional training and education are provided. If a system error is identified, the technical team performs a root cause analysis and corrective action is taken.
Results: Through these simple changes to the audit tool, we have successfully increased practitioner participation and peer feedback. Quantitatively, we have identified a 7% increase in system improvement opportunities, thereby enhancing Just Culture and reducing the likelihood of human error. Our success was achieved by effectively analyzing data, implementing improvement initiatives, and assessing the impact of enhancements and outcomes.