Evaluating Healthcare Simulation Website Offers Free SBE Evaluation Tools
Evaluating Healthcare Simulation is a new website created to provide healthcare simulation educators and researchers with freely available instruments (tools) for evaluating different aspects of simulation-based education (SBE). This free resource was downloaded 450 times in just its first few weeks!
The use of metrics, measurement, and tools is essential to comprehensive Simulation Program evaluation. Think of metrics as the yardstick by which you will measure program success. Metrics can also refer to the standards of measurement by which a program gauges efficiency, operational and educational performance, or the progress of a specific initiative, process, or product. For example, Simulation Program metrics can be expressed as a) scenario quality, b) facilitator evaluation scores, c) customer satisfaction scores, and d) organizational culture readiness. Measuring against these metrics requires tools that have been designed specifically for the intended purpose and that have established validity and reliability.
Simulation Programs worldwide are working to establish program metrics and to use appropriate measurement tools that support the generation of quality evidence. Key findings from the National Council of State Boards of Nursing (NCSBN) multisite national study of simulation programs in U.S. nursing schools (Hayden et al., 2014), and the subsequent publication of the NCSBN Simulation Guidelines for Prelicensure Nursing Programs (Alexander et al., 2015), highlight the need for programs to engage in comprehensive Simulation Program evaluation.
The NCSBN guidelines (Alexander et al., 2015) address the need for programs to evaluate simulation-based learning experiences using tools grounded in the INACSL Standards of Best Practice: Simulation. According to the guidelines, program priorities should include collecting and retaining evaluation data on facilitator competency and the effectiveness of the learning experience, as well as providing evaluative feedback to educators (Alexander et al., 2015). Moreover, the Society for Simulation in Healthcare Accreditation Standards (SSH, 2016) require that Simulation Programs conduct, and provide evidence of, higher-level comprehensive program evaluation for accreditation purposes.
If you are managing a Simulation Program and want to strategically implement comprehensive healthcare simulation evaluation using valid and reliable measurement tools, visit the new Evaluating Healthcare Simulation website. The site offers an online repository of free tools, created by a group of healthcare simulation educators and researchers, designed specifically for evaluating various aspects of SBE. Today, simulationists must go beyond measuring satisfaction and participant confidence to engage in comprehensive program evaluation using tools with proven validity and reliability. Evaluating Healthcare Simulation currently features five such tools for evaluating the following aspects of a Simulation Program:
- Facilitator Competency Rubric (FCR) to evaluate simulation facilitators based on a novice-to-expert competency scale,
- Simulation Effectiveness Tool – Modified (SET-M) to evaluate the simulated clinical experience,
- Simulation Culture Organizational Readiness Survey (SCORS) to evaluate leadership and organizational culture readiness to integrate simulation into the curriculum,
- Clinical Learning Environment Comparison Survey (CLECS) to evaluate how well learning needs are met in the traditional and simulation undergraduate clinical environments, and the
- ISBAR Interprofessional Communication Rubric (IICR) to evaluate and measure students’ communication with a physician.
Today’s article was submitted by Colette Foisy-Doll RN MSN CHSE, Director of MacEwan University Clinical Simulation Centre, and Dr. Kim Leighton PhD RN CHSE CHSOS ANEF FAAN, Curriculum and Instruction Developer, Adtalem – Chamberlain College of Nursing.
- Alexander, M., Durham, C. F., Hooper, J. I., Jeffries, P. R., Goldman, N., Kardong-Edgren, S., . . . Tillman, C. (2015). NCSBN simulation guidelines for prelicensure nursing programs. Journal of Nursing Regulation, 6(3), 39-42.
- Hayden, J. K., Smiley, R. A., Alexander, M., Kardong-Edgren, S., & Jeffries, P. R. (2014). The NCSBN national simulation study: A longitudinal, randomized, controlled study replacing clinical hours with simulation in prelicensure nursing education. Journal of Nursing Regulation, 5(2), C1-S64.
- INACSL Standards Committee. (2016). INACSL standards of best practice: SimulationSM. Clinical Simulation in Nursing, 12(Suppl), S1-S50.
- INACSL Standards Committee. (2017, December). INACSL standards of best practice: SimulationSM: Operations. Clinical Simulation in Nursing, 13(12), 681-687.
- Society for Simulation in Healthcare. (2016, May). Committee for accreditation of healthcare simulation programs: Core standards and measurement criteria.
Lance Baily, BA, EMT-B, is the Founder & CEO of HealthySimulation.com, which he started while serving as the Director of the Nevada System of Higher Education’s Clinical Simulation Center of Las Vegas back in 2010. Lance is also the Founder and acting Advisor to the Board of SimGHOSTS.org, the world’s only non-profit organization dedicated to supporting professionals operating healthcare simulation technologies. His new co-edited book, “Comprehensive Healthcare Simulation: Operations, Technology, and Innovative Practice,” is available now. Lance’s background also includes serving as a Simulation Technology Specialist for the LA Community College District, EMS firefighting, Hollywood movie production, rescue diving, and global travel. He lives with his wife Abigail in Las Vegas, Nevada.