Mean square

In mathematics and its applications, the mean square is defined as the arithmetic mean of the squares of a set of numbers or of a random variable,[1] or as the arithmetic mean of the squares of the differences between a set of numbers and a given "origin" that may not be zero (for example, the origin may be a mean or an assumed mean of the data).[2]
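Written out explicitly (the symbols x_1, …, x_n for the data values and X for the random variable are notation chosen here for illustration), the two cases are

    \mathrm{MS} = \frac{1}{n}\sum_{i=1}^{n} x_i^{2}
    \qquad\text{and}\qquad
    \mathrm{MS} = \operatorname{E}\!\left[X^{2}\right].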

When the mean square is calculated relative to a given "target" or "correct" value, or as the mean square of the differences from a sequence of correct values, it is known as the mean squared error.
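In symbols (with t_1, …, t_n denoting the target or correct values, again notation chosen here for illustration), the mean squared error of values x_1, …, x_n is

    \mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n} \left(x_i - t_i\right)^{2}.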

A typical estimate of the variance from a set of sample values uses a divisor of one less than the number of values, n − 1, rather than the divisor n of a simple arithmetic mean, and the result is still called a mean square (e.g. in analysis of variance):

    s^{2} = \frac{1}{n-1}\sum_{i=1}^{n} \left(x_i - \bar{x}\right)^{2},

where \bar{x} is the arithmetic mean of the sample values x_1, …, x_n.
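More generally, each mean square in an analysis-of-variance table is a sum of squares divided by its degrees of freedom,

    \mathrm{MS} = \frac{\mathrm{SS}}{\mathrm{df}},

and the sample variance above is the special case in which the degrees of freedom equal n − 1.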

The second moment of a random variable, E[X²], is also called the mean square.
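The second moment is related to the variance by the standard identity

    \operatorname{E}\!\left[X^{2}\right] = \operatorname{Var}(X) + \left(\operatorname{E}[X]\right)^{2},

so the mean square of a random variable equals its variance exactly when the mean E[X] is zero.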

The square root of a mean square is known as the root mean square (RMS or rms); for a random variable or data set with mean zero it therefore equals the standard deviation, and more generally it can be used as an estimate of the standard deviation.
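As a minimal numerical sketch of these quantities (Python with NumPy; the sample data and variable names are chosen purely for illustration):

    import numpy as np

    x = np.array([1.0, -2.0, 3.0, -4.0])    # arbitrary sample values

    mean_square = np.mean(x ** 2)            # (1/n) * sum of squares
    rms = np.sqrt(mean_square)               # root mean square

    # Mean square of deviations from the sample mean with divisor n - 1
    # (the "mean square" of analysis of variance): the sample variance.
    sample_variance = np.sum((x - x.mean()) ** 2) / (len(x) - 1)
    assert np.isclose(sample_variance, np.var(x, ddof=1))

    # For data with mean exactly zero, rms would equal the population
    # standard deviation np.std(x); here the mean is -0.5, so they differ.
    print(mean_square, rms, sample_variance, np.std(x))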

References

  1. "Noise and Noise Rejection" (PDF). engineering.purdue.edu/ME365/Textbook/chapter11. Retrieved 6 January 2020.
  2. "OECD Glossary of Statistical Terms". oecd.org. Retrieved 6 January 2020.