Using the method of moments to estimate parameters
The method of moments works by relating moments of a distribution to the estimand, that is, the parameter we want to estimate. So what is a moment?
A moment is a special statistic of a distribution. The most commonly used kind is the nth moment of a real-valued continuous random variable with PDF f(x). Let's use M to denote the moment; it is defined as follows, where the order of the moment appears as the exponent:

M_n = \int_{-\infty}^{\infty} (x - c)^n f(x) \, dx

This is said to be the nth moment about the value c. Often, we set c to be 0:

M_n = \int_{-\infty}^{\infty} x^n f(x) \, dx
Some results are immediately available. For example, because the integral of a valid Probability Density Function (PDF) over its support is always 1, we have M_0 = \int_{-\infty}^{\infty} x^0 f(x) \, dx = \int_{-\infty}^{\infty} f(x) \, dx = 1. Also, M_1 = \int_{-\infty}^{\infty} x f(x) \, dx is the expected value of the distribution, that is, the mean.
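To make this concrete, here is a minimal sketch that checks these two facts numerically. The choice of an exponential distribution with rate lam = 2.0, and the use of NumPy and SciPy, are assumptions made only for this illustration.

import numpy as np
from scipy import integrate

# Example PDF: exponential distribution with rate lam = 2.0 (arbitrary choice for illustration)
lam = 2.0
def pdf(x):
    return lam * np.exp(-lam * x)

def moment(n, c=0.0):
    # nth moment about c, integrated over the support [0, inf)
    value, _ = integrate.quad(lambda x: (x - c) ** n * pdf(x), 0.0, np.inf)
    return value

print(moment(0))  # ~1.0, because a valid PDF integrates to 1
print(moment(1))  # ~0.5, the mean of this exponential distribution (1 / lam)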
A note on central moments
When c is set to the mean rather than 0, the resulting moments are called central moments. In this setting, the second central moment, M_2, becomes the variance.
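As a quick sanity check, the sketch below draws a sample from a normal distribution (the parameters loc=3.0 and scale=2.0 are arbitrary example values) and confirms that the second central moment of the sample matches NumPy's variance.

import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=3.0, scale=2.0, size=100_000)  # arbitrary example parameters

# Second central moment: mean squared deviation from the sample mean
m2 = np.mean((sample - sample.mean()) ** 2)

print(m2)              # close to the true variance, 2.0 ** 2 = 4.0
print(np.var(sample))  # identical to m2 (np.var uses ddof=0 by default)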
Let's understand how these moments are used to estimate parameters...