Predictive Analytics

What is Simulation?

Introduction

Simulation software and methodology have been used successfully in supply chain, logistics, and manufacturing for many years and will continue to play a major role in these industries. The push for more efficient operations, producing more with less, and continuously improving and refining quality has driven the adoption of simulation software. Moreover, the broad shift toward lean thinking and Six Sigma methods, coupled with increasing operational variability, has forced many companies to make dynamic predictive technologies an integral part of their daily operating procedures.

 

The healthcare industry, from hospitals and clinics to labs and blood centers, can also benefit from dynamic predictive technology. The type of technology selected and its scalability play a major role in implementation success and in the level of return on investment achieved.

This white paper outlines the steps and methods used to select the right tool and achieve simulation success.

 

Problem Statement

The healthcare industry currently faces an increasing influx of patients, shrinking revenue, and a shortage of skilled labor, all while needing to maintain a high level of care quality. Although some processes are well defined, the industry suffers from high variability, seasonality, and provider-driven schedules.

Lean and Six Sigma methods are considered the best options for implementing more efficient operations and increasing operational capacity without affecting the level of care or requiring major investment in new infrastructure. Unfortunately, these methods require change, and they need proper planning and analysis to be implemented successfully in any environment. Helping people accept that change is another hurdle facing all process improvement initiatives.

 

Simulation Software

The simulation software required for process analytics consists, in general, of discrete event simulators that allow the user to create a virtual replica of the current operation. These tools can be grouped into two main categories, static and dynamic, and the main differences between them lie in model building, simulation interaction, analysis, and connectivity.
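To make the discrete event idea concrete, the minimal sketch below (in Python, purely for illustration; the single-provider clinic, arrival rate, and service time are hypothetical assumptions, not taken from any tool) advances a simulation clock from one event to the next, arrivals and departures in a queue, rather than in fixed time steps. This event-driven core is what both static and dynamic tools build on.

    import heapq
    import random

    def run_clinic(sim_hours=8, mean_arrival_min=6, mean_service_min=10, seed=42):
        """Toy discrete-event simulation of a single-provider clinic queue."""
        rng = random.Random(seed)
        end_time = sim_hours * 60
        events = []                      # priority queue of (time, kind) events
        heapq.heappush(events, (rng.expovariate(1 / mean_arrival_min), "arrival"))
        queue, busy, waits, arrivals = [], False, [], 0

        while events:
            now, kind = heapq.heappop(events)
            if now > end_time:
                break
            if kind == "arrival":
                arrivals += 1
                queue.append(now)
                # schedule the next arrival
                heapq.heappush(events, (now + rng.expovariate(1 / mean_arrival_min), "arrival"))
            if kind == "departure":
                busy = False
            # start service if the provider is free and someone is waiting
            if not busy and queue:
                arrived = queue.pop(0)
                waits.append(now - arrived)
                busy = True
                heapq.heappush(events, (now + rng.expovariate(1 / mean_service_min), "departure"))

        print(f"{arrivals} arrivals, average wait {sum(waits) / len(waits):.1f} min")

    run_clinic()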

 

Model Building

Static simulation tools are normally code heavy, meaning a computer program must be written (in C++, C#, VB, or a proprietary language) and compiled in order to build the model. Some static tools offer a basic visual code-generation environment to get the model started, yet expanding the model into anything useful requires modifying the code and adding more control logic.
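As an illustration of what code-heavy model building looks like in practice, the sketch below expresses a simple registration-then-exam flow using SimPy, an open-source Python discrete event library chosen here only as a stand-in; the white paper does not name or endorse any specific product, and the resource counts and process times are assumptions. Note that any change to routing or control logic means editing the program and rerunning it.

    import random
    import simpy

    RNG = random.Random(1)

    def patient(env, registration, provider, waits):
        """One patient's path: register, then see a provider."""
        arrived = env.now
        with registration.request() as req:             # queue for the registration desk
            yield req
            yield env.timeout(RNG.expovariate(1 / 4))   # ~4 min registration
        with provider.request() as req:                 # queue for a provider
            yield req
            waits.append(env.now - arrived)
            yield env.timeout(RNG.expovariate(1 / 15))  # ~15 min exam
        # any change in routing or logic means editing this code and rerunning

    def arrivals(env, registration, provider, waits):
        while True:
            yield env.timeout(RNG.expovariate(1 / 6))   # ~6 min between arrivals
            env.process(patient(env, registration, provider, waits))

    env = simpy.Environment()
    registration = simpy.Resource(env, capacity=1)
    provider = simpy.Resource(env, capacity=2)
    waits = []
    env.process(arrivals(env, registration, provider, waits))
    env.run(until=8 * 60)                               # one 8-hour shift
    print(f"served {len(waits)}, mean wait to provider {sum(waits) / len(waits):.1f} min")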

Dynamic simulation software, on the other hand, is the exact opposite: model building is done graphically with almost no reliance on code. Its model-building environment does not generate code but interacts directly with the simulation engine. The main advantage is that users do not have to be programmers; they only need enough knowledge of the operation to create the model. Simulation is no longer controlled by a select few, but available to every enabler in the organization.

 

Simulation Interaction

Static tools provide no interaction with the simulator: the simulation runs on the engine, and the results are provided after the run is completed, along with the animation. Some static tools do provide animation during the model run, yet offer no ability to interact with the model itself. This is an inherent design limitation of static tools, due to their reliance on code compiled before the model is run on the simulation engine.

Dynamic simulation tools provide game-like interaction with the simulated environment. As the simulation is running, the user can modify constraints on the fly and immediately visualize the effect of the change. The impact on the flow of adding stations, modifying staff, changing buffers, and making other constraint changes can be quickly visualized and analyzed.
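The sketch below illustrates the idea of mid-run constraint changes under the same hypothetical clinic assumptions: a staffing level is changed at a chosen simulated minute, and the remainder of the run immediately reflects it. The time-stepped loop and parameter names are illustrative only, not any vendor's API.

    import random

    def run_shift(staff_schedule, sim_minutes=480, seed=7):
        """Time-stepped queue model whose staffing can change mid-run."""
        rng = random.Random(seed)
        staff = staff_schedule[0]
        queue = 0
        log = []
        for minute in range(sim_minutes):
            # apply any constraint change scheduled for this minute ("on the fly")
            staff = staff_schedule.get(minute, staff)
            # ~0.5 arrivals per minute on average
            queue += sum(1 for _ in range(2) if rng.random() < 0.25)
            # each staffed provider finishes a patient with probability 1/10 per minute
            served = sum(1 for _ in range(min(staff, queue)) if rng.random() < 0.10)
            queue -= served
            if minute % 60 == 0:
                log.append((minute, staff, queue))
        return log

    # baseline staffing of 3, bumped to 5 at minute 240 to see the effect on the queue
    for minute, staff, queue in run_shift({0: 3, 240: 5}):
        print(f"t={minute:3d}  staff={staff}  queue={queue}")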

 

Analysis

Static simulation tools provide analysis and reporting only after the simulation run has completed, with no data feedback during the run. In addition, special care must be taken to identify the metrics required from the simulation run; if a metric was not collected and coded in, the user must rerun the simulation.

Dynamic tools provide analysis on the fly as the simulation progresses, allowing the user to quickly identify problems, bottlenecks, and other constraint limitations as the simulation develops. In addition, metrics tracking can be added at any time during the simulation run. Introducing and analyzing variability in the model is fast and effective, saving valuable time in reaching the optimum solution.
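A brief sketch of in-run analysis, again using hypothetical clinic numbers: metrics accumulate while the simulation advances, and a simple bottleneck alert fires the moment a rolling average crosses a threshold, rather than after the run has ended.

    from collections import deque
    import random

    def monitor_run(sim_minutes=480, staff=3, alert_queue_len=10, seed=3):
        """Accumulate metrics while the run progresses and flag bottlenecks live."""
        rng = random.Random(seed)
        queue = 0
        recent = deque(maxlen=30)        # rolling 30-minute window of queue lengths
        for minute in range(sim_minutes):
            queue += 1 if rng.random() < 0.5 else 0
            queue -= sum(1 for _ in range(min(staff, queue)) if rng.random() < 0.10)
            recent.append(queue)
            rolling_avg = sum(recent) / len(recent)
            # in-run analysis: report as soon as the rolling average crosses the threshold
            if rolling_avg > alert_queue_len:
                print(f"bottleneck flagged at t={minute} min (rolling avg queue {rolling_avg:.1f})")
                break
        else:
            print("no bottleneck flagged during the shift")

    monitor_run()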

 

Connectivity

Static simulation tools, in general, connect to external data sources before the simulation starts, and analysis is then performed only on the loaded data. This limits the amount of data that can be imported into the model.

Dynamic simulation tools interact with external systems as the simulation develops. EMR connectivity and access to large data sets, for example to study the effect of a change over a full year, are no longer limited by the user's workstation. The simulation model can read and write data to external data sources through a tightly integrated environment.
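The sketch below illustrates that kind of connectivity in miniature: scheduled arrivals are read from an external store and intermediate results are written back as the run proceeds. SQLite from the Python standard library stands in for whatever EMR extract or data warehouse connection a real deployment would use, and the file, table, and column names are invented for the example.

    import sqlite3

    # stand-in for an external system; a real model might read from an EMR extract
    # or data warehouse instead of a local SQLite file
    conn = sqlite3.connect("clinic_demo.db")
    conn.execute("CREATE TABLE IF NOT EXISTS arrivals (minute INTEGER, patients INTEGER)")
    conn.execute("CREATE TABLE IF NOT EXISTS results (minute INTEGER, queue INTEGER)")
    conn.execute("DELETE FROM arrivals")
    conn.execute("DELETE FROM results")
    conn.executemany(
        "INSERT INTO arrivals VALUES (?, ?)",
        [(m, 1 + (m // 120)) for m in range(0, 480, 30)],   # fabricated arrival schedule
    )
    conn.commit()

    queue = 0
    for minute, patients in conn.execute("SELECT minute, patients FROM arrivals ORDER BY minute"):
        queue += patients
        queue = max(0, queue - 2)        # assume roughly two patients served per 30-minute block
        # write intermediate results back to the external store as the run develops
        conn.execute("INSERT INTO results VALUES (?, ?)", (minute, queue))
    conn.commit()

    print(conn.execute("SELECT COUNT(*) FROM results").fetchone()[0], "result rows written")
    conn.close()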

 

Other factors

The most important difference between the two environments is that static simulation tools require programmers or highly trained individuals to build and analyze the model, in addition to an extended amount of time to build and validate it. Dynamic simulation tools, on the other hand, can be used by any user with working knowledge of the process and a creative mind for change. Validating, analyzing, and presenting the model takes minimal effort, leaving the user with more time to generate a better solution.

With the simulation environment becoming an integral part of an organization's process improvement and lean initiatives, empowering its people to perform the simulation analysis is more beneficial, in both the short and long term, than relying on individuals with limited knowledge of the operation to code the simulation models. Moreover, the additional time spent coding and validating the model is lost revenue that remains uncollected.

Selecting the correct simulation tool goes beyond the initial purchase price; it is important to investigate all aspects of the software, including its dynamic capabilities, its analysis features, and most of all its connectivity to external data sources. Because salespeople tend to promise too much during the initial meeting, it is highly recommended that the user ask for models to be built live with minimal preparation and for analysis and connectivity to be demonstrated. Failing to do so can result in investing in the wrong tool, one that will never be used to its full potential.

 

Conclusion

In conclusion, dynamic simulation tools seem to provide a better modeling and analysis environment than static tools. External data connectivity and ease of model building are key considerations in selecting a simulation tool. Regardless of which tool you select, a learning curve is required to make the best of the simulation environment, so choosing the tool with the shortest learning curve, one that can be used effectively by your current subject matter experts, is critical. Needless to say, explore future support options and support charges; saving a few dollars on the price of the software may result in large expenditures for support later in the process.
