We study the approximation of $\mathbb{E}f(X_{T})$ by a Monte Carlo algorithm, where $X$ is the solution of a stochastic differential equation and $f$ is a given function. We introduce a new variance reduction method, which can be viewed as a statistical analogue of the Romberg extrapolation method. Namely, we use two Euler schemes with steps $\delta$ and $\delta^{\beta}$, $0<\beta<1$. This leads to an algorithm which, for a given level of statistical error, has a complexity significantly lower than that of the standard Monte Carlo method. We analyze the asymptotic error of this algorithm in the context of general (possibly degenerate) diffusions. In order to find the optimal $\beta$ (which turns out to be $\beta=1/2$), we establish a central limit type theorem, based on a result of Jacod and Protter for the asymptotic distribution of the error in the Euler scheme. We test our method on various examples; in particular, we adapt it to Asian options. In this setting, we obtain a CLT and, as a by-product, an explicit expansion of the discretization error.
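For illustration only, the following Python sketch implements a two-level estimator of the kind described above: a coarse Euler scheme with step $T/m$, $m \approx n^{\beta}$, corrected by the mean difference between the fine (step $T/n$) and coarse schemes driven by the same Brownian paths. The geometric Brownian motion dynamics, the call payoff, the parameter values and the sample sizes `N1`, `N2` are assumptions chosen for demonstration; this is not the implementation used in the paper.

```python
import numpy as np

def euler_paths(x0, r, sigma, T, n_steps, dW):
    """Euler scheme for dX = r*X dt + sigma*X dW along given Brownian increments.
    dW has shape (n_paths, n_steps)."""
    dt = T / n_steps
    x = np.full(dW.shape[0], x0, dtype=float)
    for k in range(n_steps):
        x = x + r * x * dt + sigma * x * dW[:, k]
    return x

def statistical_romberg(f, x0, r, sigma, T, n, beta=0.5, N1=100_000, N2=10_000, rng=None):
    """Two-level estimator of E[f(X_T)]: coarse level (step T/m, m = n**beta)
    plus a coupled correction term computed on the fine grid (step T/n)."""
    rng = np.random.default_rng(rng)
    m = max(int(round(n ** beta)), 1)

    # Level 1: many cheap paths with the coarse Euler scheme (step T/m).
    dW_coarse = rng.normal(0.0, np.sqrt(T / m), size=(N1, m))
    level1 = f(euler_paths(x0, r, sigma, T, m, dW_coarse)).mean()

    # Level 2: fewer paths; fine and coarse schemes share the SAME Brownian motion,
    # so the difference has small variance.
    dW_fine = rng.normal(0.0, np.sqrt(T / n), size=(N2, n))
    x_fine = euler_paths(x0, r, sigma, T, n, dW_fine)
    # Aggregate fine increments into m coarse increments of the same Brownian path
    # (for simplicity we assume m divides n, e.g. n = 100, beta = 1/2, m = 10).
    dW_agg = dW_fine.reshape(N2, m, n // m).sum(axis=2)
    x_coarse = euler_paths(x0, r, sigma, T, m, dW_agg)
    correction = (f(x_fine) - f(x_coarse)).mean()

    return level1 + correction

if __name__ == "__main__":
    payoff = lambda x: np.maximum(x - 100.0, 0.0)  # illustrative European call payoff
    est = statistical_romberg(payoff, x0=100.0, r=0.05, sigma=0.2, T=1.0, n=100)
    print("Two-level estimate of E[f(X_T)]:", est)
```

The point of the construction is that the correction term is cheap to estimate accurately (its variance vanishes as the discretization is refined), so most of the computational budget can go to the coarse level, which is why the overall complexity beats the standard Monte Carlo method for a given statistical error.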