
Healthcare Simulation Tools

Healthcare simulation tools are crucial for improving clinical education, training, patient care, research, and operational outcomes. The use of metrics, measurement, and tools is an essential component of a comprehensive simulation program evaluation. Think of metrics as the yardstick by which simulation program success is measured. Metrics can also refer to the standards by which a program measures efficiency, operational and educational performance, and the progress of a specific initiative, process, or product. This HealthySimulation.com resource page covers tools that support healthcare simulation across a wide range of simulation program departments and processes.

Research in clinical simulation has grown to evaluate more than learner preference, satisfaction, and self-confidence. Measurement is how researchers understand the outcomes of an intervention. When measuring outcomes, one must be clear about what is being measured: research may examine how healthcare simulation itself works (research on simulation) or use medical simulation to measure something else (research with simulation). Measurements can include the knowledge, skills, and attitudes (KSAs) incorporated into clinical simulation experiences, assessed through formative, summative, or high-stakes assessments. When the facilitator provides feedback, the learner can reflect on their performance and improve future practice based on what was gained from the simulation-based experience.

HealthySimulation.com hosts Leighton et al.'s Evaluating Healthcare Simulation Tools. The evaluation instruments on this page were developed to evaluate different aspects of simulation-based education (SBE), and each has undergone psychometric testing to establish the reliability and validity of its data when evaluating healthcare simulation outcomes. The instruments are freely available through the links on each page. The researchers behind these evaluation tools believe evaluation must go well beyond satisfaction and confidence when evaluating SBE as a pedagogy. HealthySimulation.com is proud to host this page as an extremely valuable resource for healthcare simulationists.

International Nursing Association for Clinical Simulation and Learning (INACSL) Instruments Used in Clinical Simulation

The INACSL Research Committee has developed a Repository of Instruments used in clinical simulation. The committee has updated an evidence matrix to help simulation educators and researchers understand the history of simulation measures, background testing, known psychometrics, citations, and corresponding author information. This process started in 2019, and each evidence matrix notes the last date a particular tool and/or publication was accessed and reviewed. Updates will be posted as they are completed. Although the INACSL Research Committee has provided this list of categorized citations, the committee does not guarantee the comprehensiveness of the list or validate any psychometric properties. Visit the repository of instruments for details of published simulation tools.

Clinical Simulation in Nursing (CSN) is the official journal of INACSL and is included with the INACSL membership. CSN has published several articles on evaluation tools used in nursing simulation. Review these articles for more in-depth information on clinical simulation evaluation tools.

National League for Nursing (NLN) Simulation Tools and Instruments

The NLN has shared four tools and instruments for use in nursing education. These can be used in simulation and experiential learning opportunities. Permission to use these tools and copyright information can be obtained through the NLN website.

  • Educational Practices Questionnaire-Curriculum (EPQ-C) is a 22-item instrument using a five-point scale, designed to measure 1) whether learners agree or disagree that the seven educational practices are present in instructor-developed educational experiences; and 2) the importance of each of the seven educational practices to the learner in the educational experience. Reliability was tested using Cronbach's alpha (a sketch of how this coefficient is calculated follows this list): presence of specific practices = 0.94; importance of specific practices = 0.87. The EPQ-C has a content validity index (CVI) of 0.90 and aligns with the seven principles of best practices in undergraduate education.
  • The Simulation Design Scale (student version), a 20-item instrument using a five-point scale, was designed to evaluate the five design features of the instructor-developed simulations used in the NLN/Laerdal study. The five design features include: 1) objectives/information; 2) support; 3) problem solving; 4) feedback; 5) fidelity. Content validity was established by ten content experts in simulation development and testing. The instrument’s reliability was tested using Cronbach’s alpha, which was found to be 0.92 for presence of features, and 0.96 for the importance of features.
  • Educational Practices Questionnaire (student version), a 16-item instrument using a five-point scale, was designed to measure whether four educational practices (active learning, collaboration, diverse ways of learning, and high expectations) are present in the instructor-developed simulation, and the importance of each practice to the learner. Reliability was tested using Cronbach’s alpha. Presence of specific practices = 0.86; importance of specific practices = 0.91.
  • Student Satisfaction and Self-Confidence in Learning, a 13-item instrument designed to measure student satisfaction (five items) with the simulation activity and self-confidence in learning (eight items) using a five-point scale. Reliability was tested using Cronbach’s alpha: satisfaction = 0.94; self-confidence = 0.87.
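
Several of the instruments above report internal-consistency reliability as a Cronbach's alpha coefficient. As a minimal illustrative sketch (not part of any NLN instrument), the Python example below computes Cronbach's alpha from a small, hypothetical matrix of five-point Likert responses, with learners as rows and scale items as columns.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total scores))
    where k is the number of items on the scale.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items on the scale
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item across respondents
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of each respondent's total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: six learners answering a five-item, five-point subscale
responses = [
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 3, 2],
    [4, 4, 5, 4, 4],
    [3, 4, 3, 3, 4],
]

print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

Coefficients closer to 1.0 indicate that the items on a subscale vary together across respondents, which is why values such as 0.94 and 0.87 are generally read as strong internal consistency.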

The Center for Medical Simulation (CMS)

The Center for Medical Simulation (CMS) is a Harvard-affiliated healthcare simulation training facility that focuses on instructing a broad spectrum of learners and providers. Since the center opened in 1993, CMS has focused on developing teamwork, communication, collaboration, and crisis management skills, all taught through realistic learning scenarios. The Debriefing Assessment for Simulation in Healthcare (DASH) is designed to assist in evaluating and developing debriefing skills. Debriefing is a conversation among two or more people to review a simulated event or activity in which participants explore, analyze, and synthesize their actions and thought processes, emotional states, and other information to improve performance in real situations. High participant engagement is a hallmark of strong debriefings because it leads to deeper levels of learning and increases the likelihood of transfer to the clinical setting.

The DASH evaluates the strategies and techniques used to conduct debriefings by examining concrete behaviors. The DASH is designed to allow assessment of debriefings across a variety of disciplines and courses, varying numbers of participants, a wide range of educational objectives, and various physical and time constraints. Initial reliability testing of the student version (DASH-SV) produced a Cronbach's alpha coefficient of 0.82.

Creighton Competency Evaluation Instrument (C-CEI)

The Creighton Competency Evaluation Instrument (C-CEI) has been used to evaluate undergraduate student competency over the past decade. A comprehensive review of the literature associated with the C-CEI and its predecessor, the Creighton Simulation Evaluation Instrument (C-SEI), was completed to lay the foundation for future revision of the instrument consistent with the updated AACN Essentials (2021). Both the C-SEI and the C-CEI have demonstrated validity and reliability when used to evaluate students, new graduate nurses, and professional nurses in clinical and simulated learning environments. The C-CEI focuses on 22 general nursing behaviors divided into four categories:

  • Assessment: obtains pertinent subjective and objective data, performs follow-up assessments, assesses the environment
  • Communication: with providers, with patients and significant others, documentation, response to abnormal findings, realism/professionalism
  • Clinical Judgment: interprets vital signs, lab results, and relevant data; prioritization and outcome formulation; intervention performance and rationale; evaluation of interventions; reflection and delegation
  • Patient Safety: patient identifiers, utilizes standard precautions, safe medication administration, equipment management, technical performance, reflects on hazards and errors

University of California Irvine Medical Education Simulation Center

The UC Irvine Medical Education Simulation Center is a valuable resource for medical simulation, curriculum, and scenario resources. The evaluation tools available on this website address leadership and management, teamwork and communication, simulation design, satisfaction and self-confidence, and debriefing, along with miscellaneous evaluation topics. Available evaluation tools include:

  • CALM Instrument
  • Anesthesia Non-Technical Skills (ANTS) Evaluation
  • Non-Technical Skills for Surgeons (NOTSS)
  • Scrub Practitioners' List of Intraoperative Non-Technical Skills (SPLINTS)
  • TeamSTEPPS Teamwork Perceptions Questionnaire (T-TPQ)
  • Performance Assessment of Communication and Teamwork (PACT)
  • TeamSTEPPS Teamwork Attitudes Questionnaire (T-TAQ)
  • Objective Structured Assessment of Debriefing (OSAD)
  • Behavioral Assessment Tool (BAT)
  • Positive and Negative Affect Schedule (PANAS)
  • Tool for Resuscitation Assessment Using Computerized Simulation (TRACS)
  • Standardized Patient OSCE Evaluation Form
  • Lasater Clinical Judgment Rubric

Society for Simulation in Healthcare Journal

Simulation in Healthcare, the journal of the Society for Simulation in Healthcare (SSH), is a multidisciplinary publication encompassing all areas of application and research in healthcare simulation technology. The journal is relevant to a broad range of clinical and biomedical specialties and publishes original basic, clinical, and translational research. It also publishes the details and summaries of the SSH Research Summit, which occurs prior to or in conjunction with the International Meeting on Simulation in Healthcare (IMSH). Numerous medical simulation evaluation tools appear in the journal, and access to it is included with SSH membership.

Future of Healthcare Simulation and Simulation Evaluation

New modalities and applications for healthcare simulation are evolving rapidly. Healthcare simulation professionals must take care to evaluate the learning strategies used, their impact on learning, how learners perform, the facilitator, and the modality, all of which requires outcome measurement. New evaluation instruments are developed frequently, so the lifelong learner and nursing simulation champion must stay abreast of new, valid, and reliable tools. In the meantime, learn more about Leighton et al.'s Evaluating Healthcare Simulation Tools.
