A simulation of a system is the operation of a model of that system, called a simulation model. The steps involved in developing a simulation model, designing a simulation experiment, and performing simulation analysis are:
- Step 1. Identify the Problem: Enumerate problems with an existing system. Produce requirements for a proposed system.
- Step 2. Formulate the Problem: Select the bounds of the system, the problem or a part thereof, to be studied. Define the overall objective of the study and a few specific issues to be addressed. Define performance measures – quantitative criteria on the basis of which different system configurations will be compared and ranked. Identify, briefly at this stage, the configurations of interest and formulate hypotheses about system performance. Decide the time frame of the study. Identify the end-user of the simulation model.
- Step 3. Collect and Process Real System Data: Collect data on system specifications, input variables, as well as the performance of the existing system.
- Step 4. Formulate and Develop a Model: Develop schematics and network diagrams of the system. Translate these conceptual models into a form acceptable to the simulation software. Verify that the simulation model executes as intended. Verification techniques include traces, varying input parameters over their acceptable ranges and checking the output, substituting constants for random variables and manually checking results, and animation.
- Step 5. Validate the Model: Compare the model’s performance under known conditions with the performance of the real system. Perform statistical inference tests and get the model examined by system experts. Assess the confidence that the end-user places on the model and address problems if any.
- Step 6. Document Model for Future Use: Document objectives, assumptions and input variables in detail. Document the experimental design.
- Step 7. Select Appropriate Experimental Design: Select a performance measure, a few input variables that are likely to influence it, and the levels of each input variable. Generally, in stationary systems, the steady-state behavior of the response variable is of interest. Ascertain whether a terminating or a nonterminating simulation run is appropriate. Select the run length. Select appropriate starting conditions. Select the length of the warm-up period, if required. Decide the number of independent runs – each run uses a different random number stream and the same starting conditions – by considering output data sample size. The sample size must be large enough (at least 3-5 runs for each configuration) to provide the required confidence in the performance measure estimates. Alternatively, use common random numbers to compare alternative configurations, dedicating a separate random number stream to each sampling process within a configuration. Identify output data most likely to be correlated.
- Step 8. Establish Experimental Conditions for Runs: Address the question of obtaining accurate information and the most information from each run. Determine if the system is stationary (performance measure does not change over time) or non-stationary (performance measure changes over time).
- Step 9. Perform Simulation Runs: Perform runs according to steps 7-8 above.
- Step 10. Interpret and Present Results: Compute numerical estimates (e.g., mean, confidence intervals) of the desired performance measure for each configuration of interest. Test hypotheses about system performance. Construct graphical displays (e.g., pie charts, histograms) of the output data. Document results and conclusions.
- Step 11. Recommend Further Courses of Action: This may include further experiments to increase the precision and reduce the bias of estimators, to perform sensitivity analyses, etc.
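Steps 7 through 10 above – independent replications with separate random number streams, deletion of a warm-up period, and a confidence-interval estimate of the performance measure – can be sketched in Python. The single-server M/M/1 queue, its parameters (arrival rate 0.8, service rate 1.0), and the run counts below are illustrative assumptions, not part of the original text:

```python
import math
import random
import statistics

def mm1_waits(arrival_rate, service_rate, n_customers, seed):
    """Simulate an M/M/1 queue; return each customer's waiting time in queue."""
    rng = random.Random(seed)  # a separate random number stream per replication
    waits, arrival, prev_departure = [], 0.0, 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)
        start = max(arrival, prev_departure)   # wait if the server is busy
        waits.append(start - arrival)
        prev_departure = start + rng.expovariate(service_rate)
    return waits

def replicate(n_runs=10, warmup=200, n_customers=2000):
    """Independent runs with warm-up deletion; return the point estimate
    and a 95% confidence interval for the mean waiting time."""
    means = []
    for run in range(n_runs):
        waits = mm1_waits(0.8, 1.0, n_customers, seed=run)
        means.append(statistics.mean(waits[warmup:]))  # discard warm-up period
    m = statistics.mean(means)
    half = 2.262 * statistics.stdev(means) / math.sqrt(n_runs)  # t(0.975, df=9)
    return m, (m - half, m + half)

mean_wait, ci = replicate()
```

For this illustrative workload the analytic steady-state mean wait is ρ/(μ − λ) = 0.8/0.2 = 4.0, which provides a manual check on the simulation estimate – the kind of comparison against known results that Step 5 calls for.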
Although this is a logical ordering of steps in a simulation study, many iterations at various sub-stages may be required before the objectives of a simulation study are achieved. Not all the steps may be possible and/or required. On the other hand, additional steps may have to be performed. 
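One verification technique named in Step 4 – substituting constants for the random variables and manually checking results – can be illustrated with a hypothetical single-server queue: when the constant interarrival time exceeds the constant service time, no customer ever waits, so every waiting time must be exactly zero, which is easy to confirm by hand:

```python
def dd1_waits(interarrival, service, n_customers):
    """Deterministic single-server queue: constants replace the random
    interarrival and service times so the output can be checked manually."""
    waits, arrival, prev_departure = [], 0.0, 0.0
    for _ in range(n_customers):
        arrival += interarrival
        start = max(arrival, prev_departure)
        waits.append(start - arrival)
        prev_departure = start + service
    return waits

# Interarrival 1.25 > service 1.0, so the server is always idle on arrival:
assert all(w == 0.0 for w in dd1_waits(1.25, 1.0, 100))
```

Reversing the constants (interarrival 1.0, service 1.25) makes the queue build up by 0.25 per customer, another hand-checkable case.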
For more detailed information, see: White Paper – Introduction to Modeling and Simulation by Anu Maria
AcqLinks and References:
- Modeling & Simulation Guidance for the Acquisition Workforce – Oct 2008
- DoD Instruction 5000.61 “DoD M&S Verification, Validation, and Accreditation (VV&A)” – 9 Dec 2009
- DoD Directive 5000.59 “DoD Modeling and Simulation (M&S) Management” – 8 Aug 2007
- DoD “M&S Body of Knowledge (BOK)” – June 2008
- DAU “Test and Evaluation Management Guide” – Chapter 14 – Jan 2005
- SMC “Systems Engineering Handbook” – 15 Jan 2004
- DoD 5000.59-M “M&S Glossary” – Jan 1998
- DoD “M&S Glossary” – 1 Oct 2011
- Acquisition Modeling and Simulation Master Plan – 17 April 2006
- White Paper: JHU APL “Best Practices for the Development of M&S” – June 2010
- White Paper: Introduction to Modeling and Simulation by Anu Maria
- Presentation: Manager’s Guide to the High Level Architecture (HLA) for M&S – 11 May 2009
- Website: Air Force Agency for Modeling and Simulation
- Website: Army Modeling and Simulation Office
- Website: DoD M&S Coordination Office (MSCO)
- Website: DoD M&S Catalog
- Website: Simulation Interoperability Standards Organization (SISO)