Weighted Monte Carlo and pricing American options
This dissertation studies (1) general weighted Monte Carlo (WMC) techniques, (2) the connection between weighted Monte Carlo and a least-squares regression method for pricing American options, and (3) convergence results for regression-based methods.
We analyze a class of simulation estimators that are weighted averages of independent, identically distributed (i.i.d.) replications, where the weights are chosen to constrain the weighted averages of auxiliary variables that serve as controls. We refer to these as WMC estimators and establish their large-sample properties. Depending on whether the weighted averages of the controls are constrained to their population means or to some other values, we distinguish two cases: unbiased and biased. In the unbiased case, the objective of WMC is variance reduction; in the biased case, the purpose is to correct for model error, and different objectives lead WMC estimators to converge to different values, corresponding to different ways of adjusting or correcting a model.
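As a minimal illustration of the unbiased case (a sketch with one scalar control, not the dissertation's general formulation), the weights below minimize the squared distance from the uniform weights 1/n subject to the weighted average of the control hitting its known population mean; the resulting estimator coincides with the classical linear control variate estimator.

```python
import numpy as np

def wmc_estimate(y, x, target):
    """Weighted Monte Carlo with a single scalar control.

    Chooses weights w minimizing sum_i (w_i - 1/n)^2 subject to
    sum_i w_i = 1 and sum_i w_i * x_i = target, then returns the
    weighted average sum_i w_i * y_i.  (Illustrative sketch only;
    general WMC handles vector controls and other weight criteria.)
    """
    n = len(x)
    x_bar = x.mean()
    s_xx = np.sum((x - x_bar) ** 2)
    # Closed-form weights from the Lagrangian of the quadratic program.
    # Note sum_i (x_i - x_bar) = 0, so the weights automatically sum to 1.
    w = 1.0 / n + (target - x_bar) * (x - x_bar) / s_xx
    return np.sum(w * y)

# Toy example: estimate E[exp(Z)] for Z ~ N(0, 1), using Z itself as the
# control.  Unbiased case: constrain the control to its population mean 0.
rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)
est = wmc_estimate(np.exp(z), z, target=0.0)
print(est)  # close to the exact value exp(0.5) ≈ 1.6487
```

Expanding the weighted sum shows the estimator equals the sample mean of y plus a regression-slope correction, which is exactly the linear control variate form discussed next.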
A particular example of a WMC estimator is the linear control variate estimator, which can be interpreted as the fitted value on the regression line evaluated at the population means of the controls. This observation motivates us to examine the connection between the WMC technique of Broadie et al. and the least-squares regression method of Tsitsiklis and Van Roy for pricing American options. Under certain conditions, we show that the WMC technique is equivalent to a least-squares method and leads to a less-dispersed estimator.
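To fix ideas, here is a minimal sketch of one backward-induction step of a Tsitsiklis and Van Roy style regression scheme for a Bermudan put with two exercise dates; all parameters, the payoff, and the polynomial basis are illustrative assumptions, not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths = 50_000
s0, strike, r, sigma, dt = 100.0, 100.0, 0.05, 0.2, 0.5

# Simulate geometric Brownian motion to two exercise dates t1 and t2.
z1, z2 = rng.standard_normal((2, n_paths))
drift = (r - 0.5 * sigma**2) * dt
s1 = s0 * np.exp(drift + sigma * np.sqrt(dt) * z1)
s2 = s1 * np.exp(drift + sigma * np.sqrt(dt) * z2)

def payoff(s):
    """Bermudan put payoff (illustrative)."""
    return np.maximum(strike - s, 0.0)

# Regression step: regress the discounted date-t2 value on polynomials
# in S_{t1} to approximate the continuation value C(S_{t1}).
v2 = np.exp(-r * dt) * payoff(s2)
basis = np.vander(s1 / s0, 4, increasing=True)   # 1, x, x^2, x^3
coef, *_ = np.linalg.lstsq(basis, v2, rcond=None)
continuation = basis @ coef

# Dynamic-programming recursion: value at t1 is the larger of immediate
# exercise and the estimated continuation value.
v1 = np.maximum(payoff(s1), continuation)
price = np.exp(-r * dt) * v1.mean()
print(price)
```

The regression coefficients play the same role as the control weights in WMC: both produce a fitted value of a least-squares projection, which is the equivalence the dissertation makes precise.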
Finally, we give convergence results for approximate continuation values in the setting where the number of simulated paths and the number of regression basis functions increase together. In particular, we take the underlying process to be either Brownian motion or geometric Brownian motion and regress continuation values against polynomials. We show that, to ensure convergence to the correct value, the number of paths must grow exponentially with the number of basis functions in the Brownian motion setting, and much faster still in the geometric Brownian motion setting.
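A toy numerical illustration (my own, not from the dissertation) of why the geometric Brownian motion setting is harder: the empirical Gram matrix of a monomial basis is far worse conditioned for lognormal inputs than for normal inputs, since moments of a lognormal variable grow explosively with the polynomial degree.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, max_degree = 2_000, 5

# Terminal values at t = 1: Brownian motion (normal) and a driftless
# unit-volatility geometric Brownian motion (lognormal), for illustration.
w = rng.standard_normal(n_paths)   # W_1 ~ N(0, 1)
s = np.exp(w)                      # S_1 = exp(W_1)

def gram_condition(x, degree):
    """Condition number of the empirical Gram matrix of 1, x, ..., x^degree."""
    basis = np.vander(x, degree + 1, increasing=True)
    gram = basis.T @ basis / len(x)
    return np.linalg.cond(gram)

cond_bm = gram_condition(w, max_degree)    # Brownian motion inputs
cond_gbm = gram_condition(s, max_degree)   # geometric Brownian motion inputs
print(f"condition number, Brownian motion:           {cond_bm:.3e}")
print(f"condition number, geometric Brownian motion: {cond_gbm:.3e}")
```

The ill-conditioned Gram matrix means far more paths are needed for the empirical least-squares problem to approximate its population counterpart, consistent with the faster growth requirement stated above.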