Point estimation
In statistics, point estimation involves the use of sample data to calculate a single value (known as a point estimate since it identifies a point in some parameter space) which is to serve as a "best guess" or "best estimate" of an unknown population parameter (for example, the population mean). More formally, it is the application of a point estimator to the data to obtain a point estimate.
Point estimation can be contrasted with interval estimation: such interval estimates are typically either confidence intervals, in the case of frequentist inference, or credible intervals, in the case of Bayesian inference.
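For illustration, the following Python sketch (with simulated data and hypothetical parameter values, not drawn from any source) computes the sample mean as a point estimate of an unknown population mean and contrasts it with a frequentist confidence interval for the same quantity.

```python
# A minimal sketch: the sample mean as a point estimate of a population
# mean, contrasted with an interval estimate. The data are simulated and
# all parameter values are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=100)  # true mean 5 is "unknown" to the analyst

point_estimate = sample.mean()  # a single "best guess" for the population mean

# A frequentist 95% confidence interval around the same estimate,
# using the t distribution and the standard error of the mean.
ci = stats.t.interval(0.95, df=len(sample) - 1,
                      loc=point_estimate,
                      scale=stats.sem(sample))

print(f"point estimate: {point_estimate:.3f}")
print(f"95% confidence interval: ({ci[0]:.3f}, {ci[1]:.3f})")
```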
Point estimators
There are a variety of point estimators, each with different properties.
- minimum-variance mean-unbiased estimator (MVUE), which minimizes the risk (expected loss) of the squared-error loss function
- best linear unbiased estimator (BLUE)
- minimum mean squared error (MMSE)
- median-unbiased estimator, which minimizes the risk of the absolute-error loss function
- maximum likelihood estimator (MLE)
- method of moments and generalized method of moments (contrasted with maximum likelihood in the sketch after this list)
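To make the contrast between estimators concrete, the sketch below (an added illustration, not from the article) computes two different point estimators of the same parameter: for a sample assumed to come from a Uniform(0, θ) distribution, the maximum-likelihood estimator is the sample maximum, while the method-of-moments estimator is twice the sample mean. The true value θ = 10 is chosen arbitrarily for the simulation.

```python
# Two point estimators of the same parameter theta for data assumed to be
# drawn from Uniform(0, theta). The true value is hypothetical.
import numpy as np

rng = np.random.default_rng(1)
theta_true = 10.0
sample = rng.uniform(0.0, theta_true, size=50)

# Maximum likelihood: the likelihood is theta**(-n) for theta >= max(x),
# which is maximized at the sample maximum.
theta_mle = sample.max()

# Method of moments: E[X] = theta / 2, so equate the sample mean
# to theta / 2 and solve for theta.
theta_mom = 2.0 * sample.mean()

print(f"MLE estimate:               {theta_mle:.3f}")
print(f"Method-of-moments estimate: {theta_mom:.3f}")
```

The two estimates generally differ, and they trade off different properties: the MLE here is biased downward but has lower variance, while the method-of-moments estimator is unbiased.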
Bayesian point estimation
Bayesian inference is typically based on the posterior distribution. Many Bayesian point estimators are the posterior distribution's statistics of central tendency, e.g., its mean, median, or mode:
- Posterior mean, which minimizes the (posterior) risk (expected loss) for a squared-error loss function; in Bayesian estimation, the risk is defined in terms of the posterior distribution, as observed by Gauss.[1]
- Posterior median, which minimizes the posterior risk for the absolute-value loss function, as observed by Laplace.[1][2]
- Maximum a posteriori (MAP), which finds a maximum of the posterior distribution; for a uniform prior probability, the MAP estimator coincides with the maximum-likelihood estimator. All three estimators are computed in the sketch after this list.
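As a minimal sketch, assume a Beta-Bernoulli model: with a Beta(a, b) prior on the success probability p and k successes observed in n Bernoulli trials, the posterior is Beta(a + k, b + n − k), and the three Bayesian point estimates above can be read off that distribution. The prior and data values below are hypothetical.

```python
# Posterior mean, median, and mode (MAP) under a Beta-Bernoulli model.
# Prior Beta(a, b) plus k successes in n trials gives the posterior
# Beta(a + k, b + n - k). All numbers are hypothetical.
from scipy import stats

a, b = 2.0, 2.0      # hypothetical prior
n, k = 20, 14        # hypothetical data: 14 successes in 20 trials

posterior = stats.beta(a + k, b + n - k)

post_mean = posterior.mean()                # minimizes squared-error posterior risk
post_median = posterior.median()            # minimizes absolute-error posterior risk
post_mode = (a + k - 1) / (a + b + n - 2)   # MAP estimate: mode of the Beta posterior,
                                            # valid when both shape parameters exceed 1

print(f"posterior mean:   {post_mean:.3f}")
print(f"posterior median: {post_median:.3f}")
print(f"MAP (mode):       {post_mode:.3f}")
```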
The MAP estimator has good asymptotic properties even for many difficult problems on which the maximum-likelihood estimator runs into difficulty. For regular problems, where the maximum-likelihood estimator is consistent, the maximum-likelihood estimator ultimately agrees with the MAP estimator.[3][4][5] Bayesian estimators are admissible, by Wald's theorem.[4][6]
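This asymptotic agreement can be checked numerically. Continuing the hypothetical Beta-Bernoulli setup above with a fixed non-uniform prior, the MAP estimate (a + k − 1)/(a + b + n − 2) and the maximum-likelihood estimate k/n converge as n grows, because the fixed prior is eventually swamped by the data.

```python
# A numerical check that MAP and ML estimates agree as n grows, using
# the Beta-Bernoulli model from the previous sketch with a fixed
# Beta(2, 2) prior and a hypothetical true p = 0.7.
import numpy as np

rng = np.random.default_rng(2)
a, b, p_true = 2.0, 2.0, 0.7

for n in (10, 100, 1000, 10000):
    k = rng.binomial(n, p_true)              # simulated number of successes
    mle = k / n                              # maximum-likelihood estimate
    map_est = (a + k - 1) / (a + b + n - 2)  # posterior mode
    print(f"n={n:>6}  MLE={mle:.4f}  MAP={map_est:.4f}  |diff|={abs(mle - map_est):.5f}")
```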
The minimum message length (MML) point estimator is based on Bayesian information theory and is not as directly related to the posterior distribution.
Special cases of Bayesian filters are important:
- Kalman filter
- Particle filter
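As a concrete illustration of the first case (a minimal sketch under assumed linear-Gaussian dynamics, not from the article), the one-dimensional Kalman filter below tracks a constant signal observed with noise; the filtered posterior mean after each observation serves as the Bayesian point estimate.

```python
# A one-dimensional Kalman filter for a static scalar state observed with
# Gaussian noise; the posterior mean is the point estimate at each step.
# All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
x_true = 4.0            # hypothetical constant state
obs_var = 1.0           # observation-noise variance (assumed known)

mean, var = 0.0, 10.0   # diffuse Gaussian prior on the state

for t in range(10):
    z = x_true + rng.normal(scale=obs_var ** 0.5)  # noisy observation
    # Kalman update for a static state: blend prior and observation
    # in proportion to their precisions.
    gain = var / (var + obs_var)
    mean = mean + gain * (z - mean)
    var = (1.0 - gain) * var
    print(f"t={t}  posterior mean (point estimate) = {mean:.3f}, variance = {var:.3f}")
```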
Several methods of computational statistics have close connections with Bayesian analysis:
- Markov chain Monte Carlo (MCMC)
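To illustrate the connection (a sketch reusing the hypothetical Beta-Bernoulli model from above), a random-walk Metropolis sampler can approximate the posterior mean even when the posterior is known only up to a normalizing constant; the exact value (a + k)/(a + b + n) is printed for comparison.

```python
# Random-walk Metropolis sampling of the Beta-Bernoulli posterior used
# earlier; the average of the draws approximates the posterior mean.
import numpy as np

rng = np.random.default_rng(4)
a, b, n, k = 2.0, 2.0, 20, 14   # hypothetical prior and data


def log_unnorm_posterior(p: float) -> float:
    """Log of the unnormalized Beta(a + k, b + n - k) density."""
    if not 0.0 < p < 1.0:
        return -np.inf
    return (a + k - 1) * np.log(p) + (b + n - k - 1) * np.log(1 - p)


p, draws = 0.5, []
for _ in range(50_000):
    proposal = p + rng.normal(scale=0.1)  # symmetric random-walk proposal
    # Accept with probability min(1, posterior ratio); out-of-range
    # proposals have -inf log density and are always rejected.
    if np.log(rng.uniform()) < log_unnorm_posterior(proposal) - log_unnorm_posterior(p):
        p = proposal
    draws.append(p)

mcmc_mean = np.mean(draws[5_000:])        # discard burn-in
print(f"MCMC posterior-mean estimate: {mcmc_mean:.3f}")
print(f"exact posterior mean:         {(a + k) / (a + b + n):.3f}")
```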
Properties of point estimates
Main article: Estimator
Desirable properties of point estimators include unbiasedness, consistency, and efficiency.
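As a brief illustration of bias (an added sketch, not from the article), the maximum-likelihood variance estimator that divides by n is biased downward, while dividing by n − 1 (Bessel's correction) makes it unbiased; the simulation below compares the two across many hypothetical samples.

```python
# Comparing the biased (divide by n) and unbiased (divide by n - 1)
# variance estimators; the true variance is 4.0 by construction and
# all other values are hypothetical.
import numpy as np

rng = np.random.default_rng(5)
n, reps, true_var = 10, 100_000, 4.0

samples = rng.normal(loc=0.0, scale=true_var ** 0.5, size=(reps, n))
biased = samples.var(axis=1, ddof=0)     # divides by n
unbiased = samples.var(axis=1, ddof=1)   # divides by n - 1 (Bessel's correction)

print(f"true variance:              {true_var:.3f}")
print(f"mean of biased estimates:   {biased.mean():.3f}")   # systematically below 4
print(f"mean of unbiased estimates: {unbiased.mean():.3f}")  # close to 4
```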
Notes
- Dodge, Yadolah, ed. (1987). Statistical data analysis based on the L1-norm and related methods: Papers from the First International Conference held at Neuchâtel, August 31–September 4, 1987. North-Holland Publishing.
- Jaynes, E. T. (2007). Probability Theory: The Logic of Science (5th printing ed.). Cambridge University Press. p. 172. ISBN 978-0-521-59271-0.
- Ferguson, Thomas S. (1996). A Course in Large Sample Theory. Chapman & Hall. ISBN 0-412-04371-8.
- Le Cam, Lucien (1986). Asymptotic Methods in Statistical Decision Theory. Springer-Verlag. ISBN 0-387-96307-3.
- Ferguson, Thomas S. (1982). "An inconsistent maximum likelihood estimate". Journal of the American Statistical Association. 77 (380): 831–834. doi:10.1080/01621459.1982.10477894. JSTOR 2287314.
- Lehmann, E. L.; Casella, G. (1998). Theory of Point Estimation (2nd ed.). Springer. ISBN 0-387-98502-6.