Simulation Experiments in Practice: Statistical Design and Regression Analysis

Research preprint
Kleijnen, J.P.C. (2007)
  • Publisher: Operations Research
    • Subject: metamodels; experimental designs; generalized least squares; multivariate analysis; normality; jackknife; bootstrap; heteroscedasticity; common random numbers; validation
    • JEL: C0; C1; C9

In practice, simulation analysts often change only one factor at a time and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to replace these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained by applying Design of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following practical questions: (i) How realistic are these assumptions in practice? (ii) How can these assumptions be tested? (iii) If the assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied?
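The DOE-plus-regression approach the abstract advocates can be illustrated with a minimal sketch: instead of varying one factor at a time, all factors are varied jointly according to a factorial design, and a first-order polynomial metamodel is fitted to the replicated I/O data by least squares. The simulation response below is a hypothetical stand-in (a known linear function plus noise), used only so the fitted coefficients can be checked; it is not a model from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

# 2^2 full factorial design in coded units (-1, +1): the DOE alternative
# to changing only one factor at a time.
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
m = 5  # replications per design point (simulation runs with different seeds)

def simulate(x1, x2, rng):
    # Hypothetical simulation: true I/O behaviour is linear in the two
    # factors, with additive noise standing in for simulation randomness.
    return 10.0 + 3.0 * x1 - 2.0 * x2 + rng.normal(scale=0.5)

# Replicate each design point and average the responses per point.
ybar = np.array([
    np.mean([simulate(x1, x2, rng) for _ in range(m)])
    for x1, x2 in design
])

# First-order polynomial regression metamodel: y = b0 + b1*x1 + b2*x2,
# estimated by ordinary least squares.
X = np.column_stack([np.ones(len(design)), design])
beta, *_ = np.linalg.lstsq(X, ybar, rcond=None)
print(beta)  # estimates near the true effects (10, 3, -2)
```

Because the design is orthogonal, each coefficient estimate uses all runs, which is why DOE extracts more information from the same simulation budget than one-factor-at-a-time experimentation.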