Abstract
We investigate information-theoretic optimality properties of the score function of the predictive likelihood as a device for updating a real-valued time-varying parameter in a univariate observation-driven model with continuous responses. We restrict our attention to models with updates of lag order one. The results provide theoretical justification for a class of score-driven models which includes the generalized autoregressive conditional heteroskedasticity model as a special case. Our main contribution is to show that only parameter updates based on the score will always reduce the local Kullback-Leibler divergence between the true conditional density and the model-implied conditional density. This result holds irrespective of the severity of model misspecification. We also show that use of the score leads to a considerably smaller global Kullback-Leibler divergence in empirically relevant settings. We illustrate the theory with an application to time-varying volatility models. We show that the reduction in Kullback-Leibler divergence across a range of different settings can be substantial compared to updates based on, for example, squared lagged observations.
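To make the update mechanism concrete, the following is a minimal sketch of a score-driven recursion for a time-varying variance parameter $f_t$, assuming a Gaussian conditional density; it is an illustration of the model class described above, not a derivation from the paper:

$$
f_{t+1} = \omega + \beta f_t + \alpha\, s_t, \qquad
s_t = S_t \cdot \nabla_t, \qquad
\nabla_t = \frac{\partial \log p(y_t \mid f_t)}{\partial f_t}.
$$

Under the assumed model $y_t \mid f_t \sim N(0, f_t)$, the score is $\nabla_t = (y_t^2 - f_t)/(2 f_t^2)$. Choosing the scaling $S_t$ as the inverse Fisher information $2 f_t^2$ gives the scaled score $s_t = y_t^2 - f_t$, so the recursion becomes $f_{t+1} = \omega + (\beta - \alpha) f_t + \alpha\, y_t^2$, i.e. the GARCH(1,1) special case mentioned in the abstract.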
| Original language | English |
| --- | --- |
| Pages (from-to) | 325-343 |
| Journal | Biometrika |
| Volume | 102 |
| Issue number | 2 |
| Publication status | Published - 2015 |