Consider a function $f: B \to R$, where $B$ is a compact subset of $R^m$, and consider a "simulation" used to estimate $f(x)$, $x \in B$, with the following properties: the simulation can switch from one $x \in B$ to another in zero time, and a simulation at $x$ lasting $t$ units of time yields a random variable with mean $f(x)$ and variance $v(x)/t$. With such a simulation we can divide $T$ units of time into as many separate simulations as we like.
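As a concrete restatement of this setup (the labels $x_i$, $t_i$, and $Y_i$ are introduced here only for illustration), splitting the budget $T$ into runs of lengths $t_1, \dots, t_n$ at points $x_1, \dots, x_n$ yields observations $Y_1, \dots, Y_n$ with
$$
\mathrm{E}[Y_i] = f(x_i), \qquad \operatorname{Var}(Y_i) = \frac{v(x_i)}{t_i}, \qquad \sum_{i=1}^{n} t_i = T,
$$
so longer runs at a point simply give lower-variance estimates of $f$ there.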
"experiment" that spends $\tau(A)$ units of time simulating points in each
$A \epsilon \mathscr{B}$, where $\mathscr{B}$ is the Borel $\sigma$-field on
B and $\tau$ is an arbitrary finite measure on $(B, \mathscr{B})$. We
call a design specified by a measure $\tau$ a "generalized design." We
propose an approximation for f based on the data from a generalized
design. When $\tau$ is discrete, the approximation, $\hat{f}$, reduces to a
"Kriging"-like estimator. We study discrete designs in detail, including
We study discrete designs in detail, including asymptotics (as the length of the simulation increases) and a numerical procedure for finding optimal $n$-point designs based on a Bayesian interpretation of $\hat{f}$. Our main results, however, concern properties of
generalized designs. In particular, we give conditions for integrals of
$\hat{f}$ to be consistent estimates of the corresponding integrals of
$f$. These conditions are satisfied for a large class of functions $f$, even when $v(x)$ is not known exactly. If $f$ is continuous and $\tau$ has a density, then consistent estimation of $f(x)$, $x \in B$, is
also possible. Finally, we use the Bayesian interpretation of $\hat{f}$ to
derive a variational problem satisfied by globally optimal designs. The
variational problem always has a solution, and we describe a sequence of $n$-point designs that approach (with respect to weak convergence) the set
of globally optimal designs. Optimal designs are calculated for some generic
examples. Our numerical studies strongly suggest that optimal designs have
smooth densities.