Focused Bayesian Prediction
Bayesian predictive distributions quantify uncertainty about out-of-sample values of a random process conditioned only on observed data, with uncertainty about model-specific parameters integrated out via the usual probability calculus. Uncertainty about the assumed model itself is, in turn, accommodated via model-averaging, on the implicit assumption that the true data generating process (DGP) is contained in the set over which the averaging occurs. Herein, we propose a new method for conducting Bayesian prediction that does not require the true DGP to be either known with certainty, or known to lie in a finite set of models. Instead, prediction is driven by a user-supplied measure of predictive accuracy. A prior class of plausible predictive models is updated to a posterior over these models, via a criterion function that captures the desired measure of accuracy. Subject to regularity, this update is shown to yield posterior concentration onto the element of the predictive class that maximizes the expectation of the relevant measure.
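Purely as an illustration of the kind of score-driven update described above, and not the authors' implementation, the sketch below applies an exponentiated-score (generalized Bayes) weighting to a finite class of candidate predictive densities in Python. The data-generating process, the candidate class, the choice of log score as the accuracy measure and the tempering constant eta are all assumptions made for the example.

# Minimal, hypothetical sketch of a score-based update over a finite
# class of candidate one-step-ahead predictive densities.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Data from an "unknown" DGP (here simply a t-distributed series).
y = stats.t.rvs(df=5, size=500, random_state=rng)

# A (misspecified) class of Gaussian predictives indexed by a scale parameter.
sigmas = np.linspace(0.5, 3.0, 50)               # candidate models
prior = np.full(sigmas.size, 1.0 / sigmas.size)  # uniform prior over the class

# Cumulative log score of each candidate predictive on the observed data.
log_scores = np.array([stats.norm.logpdf(y, loc=0.0, scale=s).sum() for s in sigmas])

# Generalized posterior: prior reweighted by the exponentiated criterion.
eta = 1.0                                        # tempering constant (assumed)
log_post = np.log(prior) + eta * log_scores
log_post -= log_post.max()                       # stabilize before exponentiating
posterior = np.exp(log_post)
posterior /= posterior.sum()

# The posterior concentrates on the candidate with the highest expected score;
# the implied predictive is the posterior-weighted mixture of the candidates.
grid = np.linspace(-6.0, 6.0, 201)
predictive = posterior @ np.array([stats.norm.pdf(grid, loc=0.0, scale=s) for s in sigmas])
print("posterior mode of the predictive class: sigma =", sigmas[posterior.argmax()])

Replacing the log score with another measure of predictive accuracy (the continuous ranked probability score, say) changes only the criterion entering the exponent; the weighting scheme itself is unchanged.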
We illustrate the potential of the method in a series of simulation experiments based on classes of misspecified predictives for a stochastic volatility process. We find substantial improvement in out-of-sample predictive performance over the conventional likelihood-based update, and use animated graphics to demonstrate why such gains are achieved. Improved forecast accuracy is also a feature of a series of empirical analyses. Most notably, we illustrate that explicit use of forecast accuracy in the Bayesian update dominates top competitors in the recent M4 forecasting competition, based on 23,000 distinct time series. All such results augur well for the potential of the new paradigm to reap benefits more broadly.
Joint work with Ruben Loaiza-Maya and David T. Frazier.
Short Bio:
Gael Martin is a Professor of Econometrics in the Department of Econometrics and Business Statistics at Monash University, Australia, and was an Australian Research Council Future Fellow from 2010 to 2013. Her primary research interests lie in the development of simulation-based inferential and forecasting methods for complex dynamic models in economics and finance. Time series models for long-memory and non-Gaussian data, including discrete count data, have been a particular focus, with state space representations central to much of that work. The development of Bayesian simulation-based methods has been an important part of her research output, including work on newer computational methods such as approximate Bayesian computation. However, frequentist methods based on the bootstrap, including their theoretical validation, have also featured in her research. Her interests centre not only on methods of inference, but also on the impact of inferential technique on probabilistic forecasting and its accuracy. Misspecification of the forecast model has been a particular focus of late. She is currently an Associate Editor of the Journal of Applied Econometrics, the International Journal of Forecasting (IJF) and Econometrics and Statistics, and was a guest editor for a special issue of the IJF on Bayesian Forecasting in Economics.
Her personal webpage, which includes all of her published work and some current projects, is at: http://users.monash.edu.au/~gmartin/
About Statistics, modelling and operations research seminars
Students, staff and visitors to UQ are welcome to attend our regular seminars.
The events are jointly run by our Operations research and Statistics and probability research groups.
The Statistics, modelling and operations research (SMOR) Seminar series seeks to celebrate and disseminate research and developments across the broad spectrum of quantitative sciences. The SMOR series provides a platform for communication of both theoretical and practical developments, as well as interdisciplinary topics relating to applied mathematics and statistics.