Description
Meteorology, particularly boundary layer meteorology (BLM), can be split into three components: theoretical, experimental, and computational BLM. Experimental BLM involves deploying meteorological instruments, gathering data, and deriving models. Computational BLM involves building numerical BLM models, validating and verifying those models, and interpreting their results. Both of these components have established methodological paradigms. In experimental BLM, the standard paradigm is the deployment of a small number of expensive sensing towers, and the alternative paradigm is the deployment of numerous inexpensive sensor stations. In computational BLM, the standard paradigm is the implementation of physics-based numerical models, and the alternative paradigm consists of statistics- and metaheuristics-based numerical models. The overarching goal of this dissertation is to explore these alternative paradigms and to determine whether they can benefit the field of boundary layer meteorology. First, the paradigm of low-cost, mass-deployed weather stations is discussed. Instead of deploying a few large, expensive sensor towers, numerous small, inexpensive sensor stations can be used. However, because these distributed sensor stations use inexpensive components, it is important to characterize their performance to ensure it meets research standards and to quantify their measurement uncertainty. It is also important to evaluate the sensor station design to ensure that the stations are suitable for field deployment. To do this, a suite of sensor stations was manufactured and tested in multiple field and laboratory settings. The errors were quantified and best practices documented. It was found that low-cost distributed sensor stations have slightly lower accuracy than research-grade stations but much greater spatial coverage. It was also found that distributed sensor station errors could, in principle, be corrected using artificial neural networks.
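The error-correction idea described above can be illustrated with a minimal sketch: a low-cost sensor's systematic error is learned from co-located reference measurements and then removed. The sketch below substitutes ordinary least squares for the dissertation's neural network, and all data (a synthetic gain-plus-offset error on a temperature reading) are hypothetical, not taken from the actual stations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a reference (research-grade) temperature and a
# low-cost sensor that reads it with a gain error, an offset, and noise.
true_temp = rng.uniform(0.0, 30.0, size=500)                      # deg C
cheap = 1.05 * true_temp + 1.5 + rng.normal(0.0, 0.3, size=500)   # deg C

# Learn a linear correction cheap -> reference; this least-squares fit
# stands in for the artificial neural network used in the dissertation.
A = np.column_stack([cheap, np.ones_like(cheap)])
coef, *_ = np.linalg.lstsq(A, true_temp, rcond=None)
corrected = A @ coef

rmse_raw = np.sqrt(np.mean((cheap - true_temp) ** 2))
rmse_cor = np.sqrt(np.mean((corrected - true_temp) ** 2))
print(f"RMSE before correction: {rmse_raw:.2f} C, after: {rmse_cor:.2f} C")
```

In this synthetic setting the learned correction removes nearly all of the systematic error, leaving only the sensor noise; a neural network generalizes the same idea to nonlinear, multi-variable error structure.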
Next, the paradigm of using machine learning instead of physics-based models is explored. In experimental BLM, measuring instrumentation can be "dense" or "sparse." During a field experiment, the instrumentation is dense, meaning that many stations are deployed and a large amount of data is gathered. After the experiment is over, however, most of the instruments are removed, leaving only a sparse set of (possibly permanent) instruments. Using a technique called nowcasting, it is possible to predict the environmental variables measured by a dense deployment using only a sparse set of sensors. This research compares artificial neural networks and multiple linear regression models for nowcasting, using data from a field experiment conducted in Cadarache, France. It is found that artificial neural networks and multiple linear regression perform similarly when nowcasting. It is also found that neither model is sensitive to the locations of the input data.

Finally, the feasibility of using unmanned aerial vehicles and metaheuristic algorithms for source localization is studied. When a contaminant is released into the atmosphere, local meteorological conditions disperse it throughout the environment. For a continuous release in a simple environment with steady-state meteorological conditions, there are multiple ways to determine the location and characteristics of the contaminant source. It is much more difficult, however, to find the source in a non-steady-state, complex environment, such as an urban environment. This research demonstrates that the particle swarm algorithm, implemented on unmanned aerial vehicles, can successfully locate the source of a contaminant. This is demonstrated on three static cases, one of which is modeled after the Oklahoma City Joint Urban 2003 experiment. It is also shown that the method can work on time-varying plumes.
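The source-localization approach can be sketched in miniature: a swarm of searchers (each standing in for a UAV) samples a concentration field and applies standard particle swarm optimization to converge on the point of maximum concentration. The concentration field below is a simplified static Gaussian plume, not the dispersion model or UAV platform used in the dissertation, and the source location, domain size, and swarm parameters are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical static plume: concentration peaks at the (unknown) source.
SOURCE = np.array([6.0, 3.0])

def concentration(p):
    """Simplified Gaussian concentration field; a stand-in for a real
    dispersion model driven by local meteorological conditions."""
    return np.exp(-np.sum((p - SOURCE) ** 2, axis=-1) / 4.0)

# Standard particle swarm optimization, maximizing concentration.
n, iters = 20, 200
w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
pos = rng.uniform(0.0, 10.0, size=(n, 2))  # "UAVs" start anywhere in a 10x10 domain
vel = np.zeros_like(pos)
pbest = pos.copy()                   # each particle's best-known position
pbest_val = concentration(pos)
gbest = pbest[np.argmax(pbest_val)].copy()  # swarm's best-known position

for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 10.0)
    val = concentration(pos)
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmax(pbest_val)].copy()

print("estimated source location:", np.round(gbest, 2))
```

On this smooth, single-peak field the swarm converges to the source; the dissertation's contribution lies in making the same mechanism work with real dispersion physics, urban geometry, and time-varying plumes.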