Friday 28 November 2014

Throwing out Babies with the Bathwater: Guidelines for Simulation (Based) Research.

A number of recent publications on the state of the science in simulation research, as well as a framework proposing ways to synchronize simulation research, have led to what would be the obvious next step: the development of standardized reporting guidelines for simulation research.

This step has very recently been undertaken by the INSPIRE network, which posted a call for participation in a collaborative project with the goal of developing standardized reporting guidelines for simulation-based research (see Haji et al., 2013 for an alternative view on the use of "based").  A guideline is a statement by which to determine a course of action and a structure put in place to streamline a particular process according to a set routine or sound practice. 

Currently, there are numerous reporting guidelines (CONSORT, STROBE, SQUIRE, etc.) for various types of studies.  Unfortunately, no reporting guideline exists for simulation research.  As a result, the way simulation studies are reported is highly variable.  By developing a standardized reporting guideline, INSPIRE hopes to raise the bar for simulation research.

I usually do not take the "devil's advocate" position, but in this case I will.  The proposal calls for guidelines for standardized reporting of simulation research.  The questions I pose are: Are we ready for such guidelines?  Education, of which simulation is a part, is a complex system that needs to be researched using multiple lenses, methodologies, and experimental paradigms.  Do we know what methods are best for researching this complex system?  Are we in danger of following perhaps prematurely formed guidelines and thus rejecting unorthodox approaches that do not fit them?  Throwing out the baby with the bathwater.

If anyone wishes to participate in the INSPIRE project, feel free to contact me and I will relay the information.  I would also like to read what our community has to say about this initiative.  Perfect timing? Premature? If developed, these guidelines will have a very direct effect on what we do, how we do it, and how we report on what we did.  Thankfully, by definition, following a guideline is never mandatory, as guidelines are not binding and are not enforced.  Perhaps this guideline-building exercise will lead all of us to a better understanding of where we currently stand in simulation research.

I think it is worth having a conversation about...

Video Based Debriefings: Can Simulation Learn from Sports?

In sports, the margin between winning and losing is very narrow.  

A team must execute a winning strategy in a matter of seconds and players must make real-time decisions that determine the outcome of a game.  Coaches develop these strategies and abilities in their teams through carefully analyzing past performance in training and competition.  

Many simulation practitioners are beginning to use similar models of performance analysis in an effort to improve outcomes.

In sports, players’ physical skills have been perfected over thousands of hours of practice and game time.  What separates winning teams from losing ones are not only physical skills and athleticism, but cognitive and communication skills executed under intense pressure.

The best coaches are the ones who prepare their teams to analyze and assess relevant game factors, make the correct decision at the right time, and execute.  They make certain their players explicitly understand expected performance measures.  Players must communicate effectively, accept and perform assigned roles, and work as a team toward the desired outcome.  

In order to minimize errors, a coach will spend hours analyzing the performance of their own team.  They may, for example, look for breakdowns in communication and execution, searching for ways to eliminate them. 

Over the last 50 years, coaches have relied on advances in technology to improve performance analysis.  During the last 10 years, SportsCode, developed by Sportstec, has been embraced at the highest levels of sport. This and similar technologies enable coaches to annotate or “code” their observations into game footage for the purposes of analyzing performance, determining opponents’ tendencies, and generating video evidence of best practices. 

Globally, many practitioners of simulation have embraced a similar analysis model.  Educators in the health care and marine training sectors are beginning to use similar technologies (for example, Studiocode) for exactly the same reasons as coaches: to identify and reduce risks and to improve educational processes.  The analysis is the same; only the environments differ. 

Healthcare professionals, as well as offshore workers, face more varied and serious situations than sports coaches do, despite what the fans may argue.  The goal of simulation is to minimize risk to safety by improving team and individual skills.  

The simulation educator is the Head Coach.

Coaches minimize risk to give their teams the best chance to win.  The goals of simulation practitioners are similar to those of sports coaches: immerse learners in critical scenarios, introduce variables that could plausibly occur, and train the team to respond correctly.

Simon Reynolds