Date of Award

Spring 1-1-2018

Document Type


Degree Name

Doctor of Philosophy (PhD)

First Advisor

William Kleiber

Second Advisor

Eric Vance

Third Advisor

Balaji Rajagopalan

Fourth Advisor

Jem Corcoran

Fifth Advisor

James Curry


Abstract

Correlated Gaussian processes are of central importance to the study of time series, spatial statistics, computer experiments, and many machine learning models. Large spatially or temporally indexed datasets bring with them a host of computational and mathematical challenges.

Parameter estimation for these processes often relies on maximum likelihood, which for Gaussian processes involves manipulations of the covariance matrix, including solving large systems of equations and computing determinants. The score function, on the other hand, avoids direct calculation of the determinant, but still requires solving a large number of linear systems. We propose an equivalent kernel approximation to the score function of a stationary Gaussian process; the approximation requires a nugget effect. We develop two such approximations, and for large sample sizes both are fast, accurate, and compare well against existing approaches.
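To make the computational burden concrete, the sketch below computes the exact score (the derivative of the Gaussian log-likelihood with respect to a covariance parameter) for a zero-mean stationary process. The exponential covariance, the nugget parameterization, and all function names here are illustrative assumptions, not the thesis's actual model; the point is that each score evaluation requires linear solves against the full n-by-n covariance matrix, which is what motivates an equivalent kernel approximation.

```python
import numpy as np

def exp_cov(dists, sigma2, phi, nugget):
    # Exponential covariance with a nugget effect (assumed model, for illustration).
    return sigma2 * np.exp(-dists / phi) + nugget * np.eye(len(dists))

def score_range(y, dists, sigma2, phi, nugget):
    """Exact score d(log-likelihood)/d(phi) for a zero-mean GP:
        0.5 * [ y' K^{-1} (dK/dphi) K^{-1} y  -  tr(K^{-1} dK/dphi) ].
    Each call needs solves with the dense n x n covariance K (O(n^3)),
    which is the cost the equivalent kernel approximation avoids.
    """
    K = exp_cov(dists, sigma2, phi, nugget)
    dK = sigma2 * (dists / phi**2) * np.exp(-dists / phi)  # dK/dphi
    Kinv_y = np.linalg.solve(K, y)       # one linear solve
    Kinv_dK = np.linalg.solve(K, dK)     # n linear solves for the trace term
    return 0.5 * (Kinv_y @ dK @ Kinv_y - np.trace(Kinv_dK))
```

Setting the score to zero in phi recovers the maximum likelihood estimate without ever forming the determinant, but the trace term above is exactly the expensive part that a fast approximation must replace.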

We then present a method for simulating time series of high-frequency wind data, calibrated to real data. The method specifies and fits a parametric model for local wind directions by embedding them in the angular projection of a bivariate normal. Incorporating a temporal autocorrelation structure in that normal induces a continuous angular correlation over time in the simulated wind directions. The final joint model for speed and direction can be decomposed into the simulation of a single multivariate normal and a series of transformations thereof, allowing fast and easy repeated generation of long time series. We compare this to a state-of-the-art approach that simulates angular time series by switching between discrete regimes of wind direction, a method that does not fully translate to high-frequency data.
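The projected-normal construction described above can be sketched as follows: two autocorrelated Gaussian components define a bivariate normal at each time step, and the angle of that vector gives a direction series with smooth circular dependence over time. The AR(1) dynamics, the mean offset `mu`, and all names below are illustrative assumptions standing in for the thesis's fitted model.

```python
import numpy as np

def simulate_directions(n, rho, mu, seed=None):
    """Illustrative projected-normal direction simulator.

    Two independent stationary AR(1) Gaussian series (u_t, v_t) with
    lag-one correlation rho are shifted by a mean vector mu; the angle
    of (u_t + mu[0], v_t + mu[1]) yields wind directions in [0, 2*pi)
    whose temporal autocorrelation is inherited from the normals.
    """
    rng = np.random.default_rng(seed)
    u = np.empty(n)
    v = np.empty(n)
    u[0], v[0] = rng.standard_normal(2)
    innov_sd = np.sqrt(1 - rho**2)  # keeps the marginal variance at 1
    for t in range(1, n):
        u[t] = rho * u[t - 1] + innov_sd * rng.standard_normal()
        v[t] = rho * v[t - 1] + innov_sd * rng.standard_normal()
    return np.mod(np.arctan2(v + mu[1], u + mu[0]), 2 * np.pi)
```

Because the whole series is a deterministic transformation of one multivariate normal draw, repeated long simulations are cheap, and unlike regime-switching schemes the directions evolve continuously from one time step to the next.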