In my STAT 210A class, we frequently have to deal with the minimum of a sequence of independent, identically distributed (IID) random variables. This happens because the minimum of IID variables tends to play a large role in sufficient statistics.

In this post, I’ll briefly go over how to find the minimum of IID uniform random variables. Specifically, suppose that $X_1, \ldots, X_n \sim \text{Unif}(0, \theta)$ and let $Z = \min(X_1, \ldots, X_n)$. How do we find $\mathbb{E}[Z]$? To compute this, observe that

$$F_Z(t) = P(Z \le t) = 1 - P(Z > t) = 1 - P(X_1 > t, \ldots, X_n > t) = 1 - \prod_{i=1}^n P(X_i > t),$$

so the next step is to determine $P(X_i > t)$. Due to the IID assumption, it doesn’t matter which $X_i$ we use. Note also that, to avoid adding cumbersome indicator functions, assume that $t \in [0, \theta]$.
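Before moving on, it’s easy to sanity-check the independence factorization numerically. Here’s a minimal sketch with numpy; the particular values of $\theta$, $n$, and $t$ are arbitrary illustrative choices, not part of the derivation:

```python
import numpy as np

# Check empirically that P(min > t) equals the product P(X_1 > t)^n.
# theta, n, t, and the trial count are arbitrary illustrative values.
rng = np.random.default_rng(1)
theta, n, t, trials = 1.0, 4, 0.3, 500_000

samples = rng.uniform(0.0, theta, size=(trials, n))
empirical = (samples.min(axis=1) > t).mean()
by_independence = (1 - t / theta) ** n  # P(X_1 > t)^n = (1 - t/theta)^n

print(f"empirical {empirical:.4f} vs product formula {by_independence:.4f}")
```

With half a million trials the two numbers agree to about three decimal places.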

The value $P(X_i > t)$ is easier to compute because it directly relates to the cumulative distribution function (CDF) of a uniform random variable. For $X_i \sim \text{Unif}(0, \theta)$, the CDF is simply $F_{X_i}(t) = t/\theta$ if $t \in [0, \theta]$. This means

$$F_Z(t) = 1 - \left(1 - \frac{t}{\theta}\right)^n,$$

so that by differentiating with respect to $t$, the density function is $f_Z(t) = \frac{n}{\theta}\left(1 - \frac{t}{\theta}\right)^{n-1}$ (the notation here with the $Z$ as the subscript is to denote the variable whose density is being defined).
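If you don’t feel like differentiating by hand, sympy will do it symbolically. This is just a sketch of that check, using the CDF $F_Z(t) = 1 - (1 - t/\theta)^n$ from above:

```python
import sympy as sp

t, theta = sp.symbols('t theta', positive=True)
n = sp.Symbol('n', positive=True, integer=True)

# CDF of the minimum of n IID Unif(0, theta) variables
F_Z = 1 - (1 - t/theta)**n

# Differentiate with respect to t to recover the density f_Z(t)
f_Z = sp.simplify(sp.diff(F_Z, t))
print(f_Z)
```

The result matches $\frac{n}{\theta}\left(1 - \frac{t}{\theta}\right)^{n-1}$.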

The reason why we need the density is because of the definition of the expectation:

$$\mathbb{E}[Z] = \int_0^\theta t \, f_Z(t) \, dt = \frac{n}{\theta} \int_0^\theta t \left(1 - \frac{t}{\theta}\right)^{n-1} dt.$$

To compute this, we integrate by parts. Set $u = t$ and $dv = \left(1 - \frac{t}{\theta}\right)^{n-1} dt$, so that $v = -\frac{\theta}{n}\left(1 - \frac{t}{\theta}\right)^n$. We get

$$\int_0^\theta t \left(1 - \frac{t}{\theta}\right)^{n-1} dt = \left[-\frac{t\theta}{n}\left(1 - \frac{t}{\theta}\right)^n\right]_0^\theta + \frac{\theta}{n} \int_0^\theta \left(1 - \frac{t}{\theta}\right)^n dt = \frac{\theta}{n} \cdot \frac{\theta}{n+1} = \frac{\theta^2}{n(n+1)},$$

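The integration by parts can also be confirmed symbolically. A small sketch with sympy, checking the closed form $\frac{\theta^2}{n(n+1)}$ for a few concrete values of $n$ (sympy handles these fixed-$n$ polynomial integrals directly):

```python
import sympy as sp

t, theta = sp.symbols('t theta', positive=True)

# Verify int_0^theta t (1 - t/theta)^(n-1) dt = theta^2 / (n (n + 1))
# for a few concrete choices of n.
for n in [1, 2, 3, 6]:
    integral = sp.integrate(t * (1 - t/theta)**(n - 1), (t, 0, theta))
    assert sp.simplify(integral - theta**2 / (n * (n + 1))) == 0
print("closed form confirmed for n = 1, 2, 3, 6")
```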
Combining this with the extra factor $\frac{n}{\theta}$ we left out (you didn’t forget that, did you?) we get $\mathbb{E}[Z] = \frac{n}{\theta} \cdot \frac{\theta^2}{n(n+1)} = \frac{\theta}{n+1}$. Notice that as $n \to \infty$, the expected value of the minimum of these uniform random variables goes to zero. In addition, this expectation is always in $(0, \theta/2]$ for $n \ge 1$. Thus, the answer passes the smell test and seems reasonable.
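As one last smell test, a quick Monte Carlo simulation agrees with $\mathbb{E}[Z] = \frac{\theta}{n+1}$. The specific values of $\theta$, $n$, and the number of trials below are arbitrary choices for illustration:

```python
import numpy as np

# Monte Carlo check of E[min(X_1, ..., X_n)] = theta / (n + 1)
# for X_i ~ Unif(0, theta); theta, n, trials are illustrative values.
rng = np.random.default_rng(0)
theta, n, trials = 2.0, 5, 200_000

samples = rng.uniform(0.0, theta, size=(trials, n))
empirical = samples.min(axis=1).mean()
theoretical = theta / (n + 1)

print(f"empirical {empirical:.4f} vs theoretical {theoretical:.4f}")
```

With these settings both numbers come out near $2/6 \approx 0.333$.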