The intuition, however, is straightforward (see, for example, Pfeiffer). Since the expected value is a linear operator and differentiation is a linear operation, under appropriate conditions we can differentiate through the expected value:
$$\frac{d^{n}}{dt^{n}} M_X(t) = \frac{d^{n}}{dt^{n}} \mathrm{E}\left[e^{tX}\right] = \mathrm{E}\left[\frac{d^{n}}{dt^{n}} e^{tX}\right] = \mathrm{E}\left[X^{n} e^{tX}\right].$$
Making the substitution $t = 0$, we obtain
$$\left.\frac{d^{n}}{dt^{n}} M_X(t)\right|_{t=0} = \mathrm{E}\left[X^{n} e^{0}\right] = \mathrm{E}\left[X^{n}\right].$$

Example. In the previous example we have demonstrated that the mgf of an exponential random variable with rate parameter $\lambda$ is
$$M_X(t) = \frac{\lambda}{\lambda - t}, \qquad t < \lambda.$$
The expected value of $X$ can be computed by taking the first derivative of the mgf,
$$\frac{d}{dt} M_X(t) = \frac{\lambda}{(\lambda - t)^{2}},$$
and evaluating it at $t = 0$:
$$\mathrm{E}[X] = \left.\frac{d}{dt} M_X(t)\right|_{t=0} = \frac{\lambda}{\lambda^{2}} = \frac{1}{\lambda}.$$
The second moment of $X$ can be computed by taking the second derivative of the mgf,
$$\frac{d^{2}}{dt^{2}} M_X(t) = \frac{2\lambda}{(\lambda - t)^{3}},$$
and evaluating it at $t = 0$:
$$\mathrm{E}[X^{2}] = \left.\frac{d^{2}}{dt^{2}} M_X(t)\right|_{t=0} = \frac{2\lambda}{\lambda^{3}} = \frac{2}{\lambda^{2}}.$$
And so on for higher moments.
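As a quick sanity check, the derivative computations above can be reproduced with a computer algebra system. The following sketch assumes sympy is available; it is only an illustration of the technique, not part of the original derivation.

```python
# A minimal sketch (assuming sympy): differentiate the exponential mgf
# lambda/(lambda - t) and evaluate the derivatives at t = 0.
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
mgf = lam / (lam - t)          # mgf of an exponential r.v. with rate parameter lam

first_moment = sp.diff(mgf, t, 1).subs(t, 0)   # E[X]
second_moment = sp.diff(mgf, t, 2).subs(t, 0)  # E[X^2]

print(sp.simplify(first_moment))   # 1/lam
print(sp.simplify(second_moment))  # 2/lam**2
```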
Proposition Let $X$ and $Y$ be two random variables. Denote by $F_X(x)$ and $F_Y(y)$ their distribution functions and by $M_X(t)$ and $M_Y(t)$ their mgfs. Then $X$ and $Y$ have the same distribution (i.e., $F_X(x) = F_Y(x)$ for any $x$) if and only if they have the same mgfs (i.e., $M_X(t) = M_Y(t)$ for any $t$ belonging to a closed neighborhood of zero).

For a fully general proof of this proposition see, for example, Feller. We just give an informal proof for the special case in which $X$ and $Y$ are discrete random variables taking only finitely many values. The "only if" part is trivial: if $X$ and $Y$ have the same distribution, then
$$M_X(t) = \mathrm{E}\left[e^{tX}\right] = \mathrm{E}\left[e^{tY}\right] = M_Y(t),$$
because the expected value of a function of a random variable depends only on the distribution of that variable. The "if" part is proved as follows. Denote by $R_X$ and $R_Y$ the supports of $X$ and $Y$ and by $p_X(x)$ and $p_Y(y)$ their probability mass functions.
Denote by $A$ the union of the two supports,
$$A = R_X \cup R_Y,$$
and by $a_1, \ldots, a_n$ the elements of $A$. The mgf of $X$ can be written as
$$M_X(t) = \mathrm{E}\left[e^{tX}\right] = \sum_{j=1}^{n} e^{t a_j} p_X(a_j).$$
By the same token, the mgf of $Y$ can be written as
$$M_Y(t) = \mathrm{E}\left[e^{tY}\right] = \sum_{j=1}^{n} e^{t a_j} p_Y(a_j).$$
If $X$ and $Y$ have the same mgf, then for any $t$ belonging to a closed neighborhood of zero
$$M_X(t) = M_Y(t)$$
and
$$\sum_{j=1}^{n} e^{t a_j} p_X(a_j) = \sum_{j=1}^{n} e^{t a_j} p_Y(a_j).$$
Rearranging terms, we obtain
$$\sum_{j=1}^{n} e^{t a_j} \left[ p_X(a_j) - p_Y(a_j) \right] = 0.$$
This can be true for any $t$ belonging to a closed neighborhood of zero only if
$$p_X(a_j) - p_Y(a_j) = 0$$
for every $j$. It follows that the probability mass functions of $X$ and $Y$ are equal.
As a consequence, their distribution functions are also equal. This proposition is extremely important and relevant from a practical viewpoint: in many cases where we need to prove that two distributions are equal, it is much easier to prove equality of the moment generating functions than to prove equality of the distribution functions.
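To make the argument concrete, the sketch below (assuming sympy; the two pmfs are made-up illustrations, not from the text) computes the mgfs of two different pmfs on the support {0, 1, 2} and shows that their difference is exactly the linear combination of $e^{t a_j}$ terms appearing in the proof, with coefficients $p_X(a_j) - p_Y(a_j)$.

```python
# A minimal sketch (assuming sympy): the difference of two mgfs of finite discrete
# r.v.'s is a linear combination of exp(t*a_j) with coefficients p_X(a_j) - p_Y(a_j);
# it vanishes identically only when the two pmfs coincide.
import sympy as sp

t = sp.symbols('t')
support = [0, 1, 2]
p_X = {0: sp.Rational(1, 4), 1: sp.Rational(1, 2), 2: sp.Rational(1, 4)}
p_Y = {0: sp.Rational(1, 3), 1: sp.Rational(1, 3), 2: sp.Rational(1, 3)}

mgf_X = sum(sp.exp(t * a) * p_X[a] for a in support)
mgf_Y = sum(sp.exp(t * a) * p_Y[a] for a in support)

diff = sp.expand(mgf_X - mgf_Y)
print(diff)   # -1/12 + exp(t)/6 - exp(2*t)/12, i.e. the coefficients p_X(a_j) - p_Y(a_j)
```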
Also note that equality of the distribution functions can be replaced in the proposition above by equality of the probability mass functions (if $X$ and $Y$ are discrete random variables) or by equality of the probability density functions (if $X$ and $Y$ are continuous random variables).

Let $X$ be a random variable possessing a mgf $M_X(t)$. Define
$$Y = a + bX,$$
where $a$ and $b$ are two constants and $b \neq 0$. Then, the random variable $Y$ possesses a mgf $M_Y(t)$ and
$$M_Y(t) = e^{at} M_X(bt).$$
To see this, replace $Y$ with $a + bX$ in the definition of the mgf and use the linearity of the expected value. Doing so, we get:
$$M_Y(t) = \mathrm{E}\left[e^{tY}\right] = \mathrm{E}\left[e^{t(a+bX)}\right] = e^{at}\,\mathrm{E}\left[e^{btX}\right] = e^{at} M_X(bt).$$

Not only can a moment-generating function be used to find moments of a random variable, it can also be used to identify which probability mass function a random variable follows.
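The rule $M_Y(t) = e^{at} M_X(bt)$ is easy to verify symbolically for a finite discrete $X$, where both sides are finite sums. The sketch below assumes sympy and uses a made-up pmf purely as an illustration.

```python
# A minimal sketch (assuming sympy): check M_Y(t) = exp(a*t) * M_X(b*t) for Y = a + b*X,
# using a small discrete X so that both sides are finite sums.
import sympy as sp

t, a, b = sp.symbols('t a b')
pmf = {0: sp.Rational(1, 2), 1: sp.Rational(1, 3), 2: sp.Rational(1, 6)}  # illustrative pmf of X

def mgf_X(s):
    # mgf of X evaluated at s, written as a finite sum over the support
    return sum(sp.exp(s * x) * p for x, p in pmf.items())

lhs = sum(sp.exp(t * (a + b * x)) * p for x, p in pmf.items())  # mgf of Y = a + bX, from the definition
rhs = sp.exp(a * t) * mgf_X(b * t)                              # the claimed formula

print(sp.simplify(sp.expand(lhs - rhs)))   # 0
```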
A moment-generating function uniquely determines the probability distribution of a random variable. This necessarily implies that if two random variables have the same moment-generating function, then they must have the same probability distribution.
Therefore, once you recognize the mgf of a known distribution, you can identify the p.m.f. of the random variable. You can find mgfs by using the definition of the expectation of a function of a random variable. Besides helping to find moments, the moment generating function has an important property often called the uniqueness property. The uniqueness property means that, if the mgf exists for a random variable, then there is one and only one distribution associated with that mgf.
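As an illustration of the uniqueness property, the following sketch (assuming sympy; the binomial case is just an example, not taken from the text) checks that the mgf computed from the Binomial(3, p) pmf coincides with the familiar closed form $(1 - p + p e^t)^n$, so a random variable whose mgf has that form must be Binomial(3, p).

```python
# A minimal sketch (assuming sympy): the mgf built from the Binomial(3, p) pmf
# equals the closed form (1 - p + p*exp(t))**3, illustrating identification via mgfs.
import sympy as sp

t, p = sp.symbols('t p')
n = 3  # a small concrete n keeps the sum explicit

mgf_from_pmf = sum(
    sp.binomial(n, k) * p**k * (1 - p)**(n - k) * sp.exp(t * k) for k in range(n + 1)
)
candidate = (1 - p + p * sp.exp(t))**n

print(sp.simplify(mgf_from_pmf - candidate))   # 0
```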
The following exchange from Cross Validated, a question and answer site for people interested in statistics, machine learning, data analysis, data mining, and data visualization, illustrates why mgfs matter in practice. In most basic probability theory courses you're told that moment generating functions (m.g.f.s) are useful for computing the moments of a random variable, in particular the expectation and variance.
Now, in most courses the examples they provide for expectation and variance can be solved analytically using the definitions. Are there any real-life examples of distributions where finding the expectation and variance is hard to do analytically, so that the use of m.g.f.s is needed? I'm asking because I feel like I don't get to see exactly why they are important in the basic courses.

You are right that mgfs can seem somewhat unmotivated in introductory courses.
So, some examples of use. First, in discrete probability problems we often use the probability generating function, but that is only a different packaging of the mgf; see What is the difference between moment generating function and probability generating function?
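To see the "different packaging" point concretely, the sketch below (assuming sympy; the fair die is an arbitrary example, not from the linked question) computes the pgf $G(s) = \mathrm{E}[s^X]$ and the mgf $M(t) = \mathrm{E}[e^{tX}]$ of the same discrete random variable and checks that $M(t) = G(e^t)$.

```python
# A minimal sketch (assuming sympy): the pgf and mgf of a discrete r.v. are related
# by the substitution s = exp(t); here X is the outcome of a fair six-sided die.
import sympy as sp

s, t = sp.symbols('s t')
pmf = {x: sp.Rational(1, 6) for x in range(1, 7)}    # fair die

pgf = sum(p * s**x for x, p in pmf.items())          # G(s) = E[s^X]
mgf = sum(p * sp.exp(t * x) for x, p in pmf.items()) # M(t) = E[exp(t X)]

print(sp.simplify(mgf - pgf.subs(s, sp.exp(t))))     # 0
```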
Another kind of use is constructing approximations of probability distributions. One example is the saddlepoint approximation, which takes as its starting point the natural logarithm of the mgf, called the cumulant generating function; see How does saddlepoint approximation work? Mgfs can also be used to prove limit theorems: for instance, the Poisson limit of binomial distributions (see Intuitively understand why the Poisson distribution is the limiting case of the binomial distribution) can be proved via mgfs.
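The Poisson limit can indeed be obtained in a few lines from the mgf. The sketch below (assuming sympy; not taken from the linked answer) sets $p = \lambda/n$ in the binomial mgf and lets $n \to \infty$.

```python
# A minimal sketch (assuming sympy): with p = lam/n, the Binomial(n, p) mgf
# (1 - p + p*exp(t))**n converges, as n -> oo, to exp(lam*(exp(t) - 1)),
# which is the Poisson(lam) mgf.
import sympy as sp

t, lam, n = sp.symbols('t lam n', positive=True)

binom_mgf = (1 - lam/n + (lam/n) * sp.exp(t))**n
limit_mgf = sp.limit(binom_mgf, n, sp.oo)

print(sp.simplify(limit_mgf - sp.exp(lam * (sp.exp(t) - 1))))   # 0
```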
Actuaries seem to use mgfs to solve some problems, arising for instance in premium calculations, that are difficult to solve otherwise.
One example is given in section 3. One source of estimated mgfs for such applications could be empirical mgfs (strangely, I cannot find even one post here about empirical moment generating functions).
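A minimal illustration of an empirical mgf, assuming numpy (the exponential sample and the choice of t values are arbitrary): average $e^{t x_i}$ over a sample and compare with the known mgf of the sampling distribution.

```python
# A minimal sketch (assuming numpy): estimate M(t) by the sample mean of exp(t * x_i)
# and compare it with the true mgf of an exponential(rate 1) distribution, 1/(1 - t) for t < 1.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=100_000)   # rate 1 <=> scale 1

def empirical_mgf(t, x):
    """Average of exp(t * x_i) over the sample."""
    return np.exp(t * x).mean()

for t in (-1.0, -0.5, 0.2, 0.4):
    print(t, empirical_mgf(t, sample), 1.0 / (1.0 - t))   # estimate vs true mgf
```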