### Project UT-B: Stochastic Modelling, Estimation and Decision Making

**Manager:** Richard Vinter

**Investigator:** Richard Vinter

**Research Staff:** Nikos Kantas (Research Associate)

**Collaborators:** Martin Clark

**Start date:** 01/06/2010

**Linked Projects:** PS-B, PS-C, EET-B and EET-C

**Summary.** Probabilistic analysis is a powerful tool for decision-making under uncertainty, which avoids the overly cautious strategies resulting from a deterministic (‘worst case’) approach that takes account of all eventualities, even those that are unlikely to occur. The aim of this project is to develop techniques of estimation and decision making based on probabilistic models, which have special relevance to research areas in electrical power generation and fuel-efficient transport addressed elsewhere in Projects PS-B, PS-C and EET-B. These include techniques for calculating the probabilities of rare ‘catastrophic’ events and system design methodologies ensuring that these probabilities lie below some very small, acceptable threshold. Examples of such events include collisions and near misses of aircraft in a crowded sky and blackouts of electrical power systems due to component failure or gross disturbances. The project also covers research into stochastic decision-making techniques for infrastructure provision (what is the appropriate level of conventional back-up power required to counterbalance the intermittency of the wind power supply?) and infrastructure utilization (when should the back-up supply be brought on line?).

**Current Status.** Research in this project has followed several directions:
One theme is the calculation of probabilities of rare events, to support work in the consortium research programme on electric power risk profiling (Project PS-B). Analytic tools are being developed which provide approximations of such quantities as the probability of a random signal exceeding a threshold, the expected frequency with which the threshold is crossed and the expected duration of violations of the threshold. When the signal is interpreted as the power system load loss, these probabilities have an important role as performance indicators in the assessment of power system reliability. A ‘small noise’ asymptotic analysis, based on probabilistic models from the power engineering literature, is used to derive formulae for these probabilities that are highly accurate when the probabilities are very small. Comparative studies are being undertaken between this approach and the estimation of the small probabilities involved using Monte Carlo techniques. Bearing in mind the inherent inefficiency of Monte Carlo simulation for approximating low-probability events, which persists even in variance-reduction schemes based on a change of sampling distribution, an asymptotic analysis has the potential for substantial savings in computational effort where it is applicable.
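To make the comparison concrete, the following is a minimal sketch, not drawn from the project's own power-system models, of estimating the rare-event probability P(X > a) for a standard normal X. It contrasts plain Monte Carlo with a variance-reduction scheme based on a change of sampling distribution (importance sampling with the shifted proposal N(a, 1)); the threshold a = 4 and sample size are illustrative choices.

```python
import math
import random

random.seed(0)

def rare_event_mc(a, n):
    """Plain Monte Carlo estimate of P(X > a) for X ~ N(0, 1)."""
    return sum(random.gauss(0, 1) > a for _ in range(n)) / n

def rare_event_is(a, n):
    """Importance sampling: draw from the shifted proposal N(a, 1),
    which centres its mass on the rare set, and correct with the
    likelihood ratio of N(0, 1) against N(a, 1), i.e. exp(a^2/2 - a*x)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(a, 1)
        if x > a:
            total += math.exp(0.5 * a * a - a * x)
    return total / n

a, n = 4.0, 100_000
exact = 0.5 * math.erfc(a / math.sqrt(2))   # P(X > 4) ~ 3.17e-5
print(f"exact     {exact:.3e}")
print(f"plain MC  {rare_event_mc(a, n):.3e}")
print(f"IS        {rare_event_is(a, n):.3e}")
```

With 10^5 samples, plain Monte Carlo sees only a handful of threshold exceedances, so its estimate is dominated by sampling noise, while the reweighted estimator is accurate to a few percent; for still smaller probabilities even this gap grows, which is the motivation for the asymptotic formulae.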

Another theme is the improvement of particle methods; that is, techniques for approximating probability distributions associated with stochastic dynamical systems, based on Monte Carlo simulations. They are now widely used in signal processing because they place practically no restrictions on the nature of the stochastic models used to describe the underlying stochastic variables, and the high computational demands of particle methods are no longer the obstacle to implementation they once were, owing to algorithm refinement and advances in computer hardware. Research in this project has been directed in part at improvements to generic particle methods for approximating the probability distributions arising in filtering and estimation, particularly at capturing probabilistic information about rare events, at developing analytical tools to assess the quality of the approximations particle methods provide, and at harnessing the benefits of particle methods not just for signal processing but also for parameter estimation and stochastic decision making. Work has also been undertaken on the ‘duality’ between stochastic optimal control and nonlinear filtering. This duality is being exploited to reformulate stochastic control problems as filtering problems, thereby opening up stochastic control to the application of particle methods. In a case study, these methods are being applied to the design of a controller for a two-degree-of-freedom stochastic mechanical system, formulated as a stochastic optimal control problem, to ensure that the displacements of one of the masses do not exceed a given threshold. The stochastic control problem is not amenable to standard solution techniques, such as Dynamic Programming, because of its high dimensionality and the presence of non-linearities.
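As an illustration of the generic technique (the mechanical case study itself is not reproduced here), the following is a minimal bootstrap particle filter for a toy scalar state-space model; the autoregressive coefficient 0.9 and the noise levels are assumptions made for the example, not parameters from the project.

```python
import math
import random

random.seed(1)

# Toy scalar model (illustrative only):
#   x_t = 0.9 * x_{t-1} + process noise,   y_t = x_t + observation noise
PROC_STD, OBS_STD = 1.0, 1.0

def simulate(T):
    """Generate a state trajectory and its noisy observations."""
    xs, ys, x = [], [], 0.0
    for _ in range(T):
        x = 0.9 * x + random.gauss(0, PROC_STD)
        xs.append(x)
        ys.append(x + random.gauss(0, OBS_STD))
    return xs, ys

def bootstrap_filter(ys, n_particles=500):
    """Bootstrap particle filter: propagate particles through the
    dynamics, weight by the observation likelihood, then resample."""
    particles = [0.0] * n_particles
    means = []
    for y in ys:
        # propagate each particle through the state dynamics
        particles = [0.9 * p + random.gauss(0, PROC_STD) for p in particles]
        # weight by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((y - p) / OBS_STD) ** 2) for p in particles]
        total = sum(weights)
        means.append(sum(w * p for w, p in zip(weights, particles)) / total)
        # multinomial resampling concentrates particles in high-likelihood regions
        particles = random.choices(particles, weights=weights, k=n_particles)
    return means

xs, ys = simulate(50)
est = bootstrap_filter(ys)
rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(est, xs)) / len(xs))
print(f"filter RMSE {rmse:.3f} (raw observation noise std {OBS_STD})")
```

Note that nothing here exploits linearity or Gaussianity of the model: the propagation and weighting steps accept arbitrary dynamics and likelihoods, which is the flexibility the paragraph above refers to.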

**Publications:**

**[KKV14]**
N. Kantas, P. Kountouriotis, R.B. Vinter,
*Application of Nonlinear Filtering Techniques to Stochastic Optimal Control*,
Automatica,
submitted

**[CV12]**
J. M. C. Clark and R. B. Vinter,
*Stochastic Exit Time Problems Arising in Process Control*,
Stochastics, Vol. 84, No. 5-6, pp. 667-681,
2012

**[WKJ11]**
N.Whiteley, N. Kantas and A.Jasra,
*Linear variance bounds for particle approximations of time-homogenous Feynman-Kac formulae*,
Stochastic Processes and their Applications,
Vol. 122, Issue 4, pp. 1840-1865, April 2012

**[KSD12]**
N. Kantas, S.S. Singh, A. Doucet,
*Distributed maximum likelihood with applications to simultaneous self-localization and tracking for sensor networks*,
IEEE Transactions on Signal Processing, Vol. 60, Issue 10, pp. 5038-5047,
2012

**[CKV10]**
J. M. C. Clark, P. A. Kountouriotis and R. B. Vinter,
*A Gaussian mixture filter for range-only tracking*,
IEEE Transactions on Automatic Control,
Vol. 56, No. 3, pp. 602-613, March 2011