
The foundations of finite sample estimation in stochastic processes. (English) Zbl 0584.62135

Let \(y_1,\dots,y_n\) be a discrete stochastic process with distribution \(F\) belonging to some class \(\mathcal F\). The paper considers the estimation of a parameter \(\theta=\theta(F)\) by solving the equation \(g(y_1,\dots,y_n,\theta)=0\), where \[ g=\sum_{i=1}^{n}a_{i-1}(y_1,\dots,y_{i-1},\theta)\,h_i(y_1,\dots,y_i,\theta) \] and \(h_i\) satisfies \[ E_F\bigl(h_i(y_1,\dots,y_i,\theta(F))\mid y_1,\dots,y_{i-1}\bigr)=0\quad\text{for all } F\in\mathcal F. \] The optimal choice of the \(a_i\)'s is determined such that \(E(g^2)/\{E(\partial g/\partial\theta)\}^2\) is minimized. This result can be viewed as a generalisation of the Gauss-Markov theorem.
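For orientation, the standard form of the solution in this setting (stated here from the general theory of optimal estimating functions, not quoted verbatim from the paper, and determined only up to a constant factor) is \[ a^*_{i-1}=\frac{E_F\bigl(\partial h_i/\partial\theta\mid y_1,\dots,y_{i-1}\bigr)}{E_F\bigl(h_i^2\mid y_1,\dots,y_{i-1}\bigr)}, \] i.e., each conditionally centred term \(h_i\) is weighted by its conditional sensitivity to \(\theta\) relative to its conditional variance.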
Applications include linear and nonlinear autoregressions and branching processes. The extension to the multiparameter case is mentioned.
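As a minimal numerical sketch of the linear autoregressive case (not code from the paper; the AR(1) model, parameter values, and variable names below are illustrative assumptions): with \(h_i=y_i-\theta y_{i-1}\), the optimal weights reduce to \(a^*_{i-1}\propto -y_{i-1}\), and the estimating equation solves in closed form as the conditional least squares estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process y_i = theta * y_{i-1} + eps_i with
# E(eps_i | past) = 0 (theta_true, sigma, n are illustrative choices).
theta_true, sigma, n = 0.6, 1.0, 500
y = np.zeros(n)
for i in range(1, n):
    y[i] = theta_true * y[i - 1] + sigma * rng.standard_normal()

# With h_i(theta) = y_i - theta * y_{i-1}, the optimal weight is
# a_{i-1} proportional to -y_{i-1}, so sum a_{i-1} h_i = 0 yields
# the conditional least squares estimator in closed form:
theta_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
print(f"estimate: {theta_hat:.3f}  (true value: {theta_true})")
```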
The author considers only a finite sample size, but in fact he introduces a different kind of asymptotics, since the criterion for optimality is the asymptotic variance when one takes infinitely many i.i.d. copies of the sample. There are close connections with robust estimation for time series; for instance, conditional unbiasedness also plays a key role in the reviewer's paper "Infinitesimal robustness for autoregressive processes", Ann. Stat. 12, 843-863 (1984).
The main difference is that the present paper requires unbiasedness over a very large nonparametric class \(\mathcal F\), which determines the functions \(h_i\) and thus leaves only a small class of estimators to consider.
Reviewer: H.-R. Künsch

MSC:

62M05 Markov processes: estimation; hidden Markov models
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62M09 Non-Markovian processes: estimation
62J05 Linear regression; mixed models
Full Text: DOI