Improving Simulation Evaluation Effectiveness During COVID-19

Simulation-based learning (SBL) has become a mainstay of undergraduate nursing programs across the nation. Many schools are utilizing healthcare simulation to replace clinical hours, augment didactic course content, and increase student engagement across the curriculum. SBL activities provide a valuable opportunity to control and manipulate the environment to ensure learners have the opportunity to achieve certain outcomes. This article highlights a nursing school’s approach to intentional, time-sensitive evaluations of SBL activities for learners and facilitators.

Evaluations stand at the core of SBL in identifying gaps, assessing progress, and adjusting content where needed to improve learner outcomes. Utilizing learner and facilitator feedback from formal evaluations can improve both teaching and learning and, in turn, positively impact future SBL activities. This approach to evaluating simulation effectiveness can be applied in person or virtually, which is especially useful as many programs transition back to in-person instruction while COVID-19 cases decrease and vaccination rates rise.

Simulation effectiveness is dependent on feedback from evaluations of learners, facilitators, and the program as a whole. Clinical simulations can be improved quickly and efficiently if the evaluation process is defined and consistent. Sustainable feedback obtained from learner evaluations of SBL activities has proven effective in driving changes to these activities for future learners.


In the spring of 2020, the world came to a screeching halt due to the COVID-19 pandemic. Schools across the nation were forced into remote learning for the safety of learners, facilitators, and patients alike. Programs were forced to go completely remote and rely heavily on healthcare simulation and virtual learning activities. Many were so accustomed to having everything needed right at their fingertips: resources, learners, and each other; but now, everything took place through a computer monitor.

During the pandemic, facilitators were forced to transition to a screen-based format and to learn new ways of evaluating clinical simulation and obtaining learner feedback. With the increase in the utilization of SBL, the importance of accurate and appropriate evaluation of these activities has also increased. Evaluations must go beyond the satisfaction and confidence level of the learner, especially during these unprecedented times, to include the ability to meet the stated objectives, improve practice, and identify any knowledge gaps of the learner.

Evaluations stand at the core of identifying gaps, assessing progress, and adjusting content as needed to improve learner outcomes. Some improvements may be as simple as changing the number of learners per group because feedback stated “there were too many learners to perform the needed tasks.”

Other feedback may state “the simulation was overwhelming” and learners did not feel prepared. This requires a more in-depth analysis of why the learners felt overwhelmed, and how or when the content was taught in relation to the simulation. As a result, the simulation may be placed later in the semester, or key objectives of the simulation may be taught more directly in the classroom.


Sustainable feedback obtained from student evaluations of SBL activities has proven effective in making changes to these activities for future learners. Utilizing student and facilitator feedback from formal evaluations can improve both teaching and learning and, in turn, positively impact future SBL activities.

Reducing Learner Resistance to Remote Learning

Learners were discouraged when in-person simulations and clinical experiences completely transitioned to remote learning. Despite the pandemic, learners still completed their pre-work, showed up virtually to their assigned screen-based simulations, and participated in pre-briefing and debriefing activities. However, when it came time to complete their remote simulation evaluations, learners were less motivated to do so.

Traditionally, with onsite simulations, learners arrived at the simulation facility, received a pre-briefing, then participated in an SBL experience followed by a structured debriefing. Once the debriefing was completed, learners were given a QR code to complete an electronic evaluation using their electronic device. During this time, the debriefer would step out of the room to give the learners privacy to complete the evaluations and not feel rushed. Once the learners completed the evaluation, a “thank you for completing this evaluation” screen was displayed indicating the evaluation was complete, which was a signal to exit the simulation.

At the beginning of the pandemic, for screen-based simulations, once the activity was over, faculty emailed the evaluation link to the learners, which they were strongly encouraged to complete. The initial response rate was around 50% of the class, which is less than desirable. Poor response rates do not lend themselves to an accurate and appropriate evaluation of the simulation activities. Simulation educators decided to try a different approach, much like what would occur during in-person simulation.
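The response-rate problem described above is straightforward to monitor programmatically. The following is a minimal sketch of that bookkeeping; the function, roster names, and 80% follow-up threshold are illustrative assumptions, not the school’s actual system:

```python
# Compare survey completions against the class roster to flag a low response
# rate before the next simulation session. All names and the threshold below
# are hypothetical examples, not data from the program in the article.

def response_rate(roster, completed):
    """Return the fraction of rostered learners who submitted an evaluation."""
    if not roster:
        return 0.0
    return len(set(completed) & set(roster)) / len(set(roster))

roster = ["ana", "ben", "cho", "dev", "eli", "fay", "gus", "hua"]
completed = ["ana", "cho", "eli", "gus"]  # roughly the 50% rate seen early on

rate = response_rate(roster, completed)
print(f"Response rate: {rate:.0%}")
if rate < 0.8:  # follow-up threshold is an arbitrary illustration
    print("Follow up with:", sorted(set(roster) - set(completed)))
```

A report like this, run after each screen-based session, makes it obvious when an evaluation approach is underperforming and who still needs a reminder.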

Another problem identified involved simulation evaluation by facilitators. Learners, as well as simulation facilitators, should be allowed to evaluate SBL activities. Many times, following in-person simulations, facilitators would make comments on things that could be improved upon, or even comment on where learners seemed to struggle and ways to improve that particular issue. Unfortunately, there was no way to capture this feedback at that moment.

Virtual Learning Solutions and Lessons Learned

Student evaluation: At the end of each screen-based simulation, learners participated in an on-screen debriefing, much like an in-person session. Once the debriefing was complete, learners were then shown an on-screen QR code via screen sharing, which linked them to an anonymous simulation evaluation survey (using Qualtrics). If learners had issues scanning the on-screen QR code, they were provided with a link to the survey in the chat box. Learners were required to show the facilitator their “thank you screen” before exiting the session.

Once the facilitator confirmed the verification, each learner was called by name, acknowledging their evaluation was complete, at which time they were allowed to exit the session. Following the initiation of this approach to evaluation, response rates returned to 100%. This solution will continue after the pandemic and will be applied to all future SBL activities.
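The exit-verification step above amounts to simple set bookkeeping: track who still owes a “thank you” screen and release learners by name once verified. A minimal sketch under that assumption (the class and names are hypothetical, not the school’s actual tooling):

```python
# Track which learners have shown the facilitator their completed "thank you"
# screen before being released from the session. The class design and names
# are illustrative only.

class EvaluationCheckout:
    def __init__(self, learners):
        self.pending = set(learners)   # learners who still owe an evaluation
        self.verified = []             # order in which learners were released

    def verify(self, learner):
        """Record that the facilitator saw this learner's thank-you screen."""
        if learner in self.pending:
            self.pending.remove(learner)
            self.verified.append(learner)

    def all_complete(self):
        """True once every rostered learner has been verified."""
        return not self.pending
```

In use, the facilitator would call `verify()` as each learner displays the confirmation screen and dismiss the group only when `all_complete()` returns `True`, which is what drove the response rate back to 100%.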

Learn More About Simulation Evaluation

Today’s article was guest authored by Tiffani Chidume, DNP, RN, CCRN-K, CHSE, an assistant clinical professor and simulation center coordinator, and Amy Curtis, PhD, RN, CHSE, an assistant clinical professor.

