We propose an observation-driven modeling framework in which model parameters vary over time through an implicit score-driven (ISD) update. The ISD update maximizes the log observation density with respect to the parameter vector while penalizing the weighted squared L2 distance to a one-step-ahead parameter prediction. This yields an implicit stochastic-gradient update. We show that the popular class of explicit score-driven (ESD) models arises when the observation log density is linearly approximated around the prediction. By preserving the full density, the ISD update extends the favorable local properties of the ESD update to a global setting. For log-concave observation densities, whether correctly specified or not, the ISD filter is stable for all learning rates, and its updates are contractive in mean squared error toward the (pseudo-)true parameter at every time step. We demonstrate the usefulness of ISD filters in simulations and in empirical applications in finance and macroeconomics.
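As a sketch of the update described above, in notation introduced here for illustration (the paper's own symbols may differ): with prediction $f_{t|t-1}$, learning rate $\alpha > 0$, and positive-definite weighting matrix $P$, the ISD update can be written as a penalized-likelihood (proximal) step,
\[
f_{t|t} \;=\; \arg\max_{f} \left\{ \log p(y_t \mid f) \;-\; \frac{1}{2\alpha}\,(f - f_{t|t-1})^{\top} P\,(f - f_{t|t-1}) \right\},
\]
whose first-order condition gives the implicit stochastic-gradient form $f_{t|t} = f_{t|t-1} + \alpha P^{-1} \nabla_f \log p(y_t \mid f_{t|t})$, with the score evaluated at the updated parameter. Linearizing $\log p(y_t \mid f)$ around $f_{t|t-1}$ instead evaluates the score at the prediction, $f_{t|t} = f_{t|t-1} + \alpha P^{-1} \nabla_f \log p(y_t \mid f_{t|t-1})$, which recovers the explicit score-driven (ESD) update.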