Let $X_1, X_2, \dots, X_n$ be a random sample from $U(0, \theta)$. We examine the application of the Cramér-Rao Lower Bound (CRLB) theorem for the estimation of $\theta$.
The support of $U(0, \theta)$, which is $S(\theta) = \{ x \mid 0 < x < \theta \}$, depends on the parameter $\theta$. This violates the regularity conditions (ii) and (iv), so the conclusions of the CRLB theorem are not guaranteed to hold.
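Concretely, the regularity conditions license interchanging differentiation with respect to $\theta$ and integration over $x$, and that interchange fails here:
$$ \frac{d}{d\theta}\int_0^\theta \frac{1}{\theta}\,dx = \frac{d}{d\theta}(1) = 0, \qquad \text{but} \qquad \int_0^\theta \frac{\partial}{\partial\theta}\!\left(\frac{1}{\theta}\right) dx = \int_0^\theta \left(-\frac{1}{\theta^2}\right) dx = -\frac{1}{\theta} \neq 0. $$
In particular, the score identity $E\!\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right] = 0$, on which the Cramér-Rao argument rests, fails: here the expectation equals $-1/\theta$.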
The probability density function is:
$$ f(x;\theta) = \frac{1}{\theta}, \qquad 0 < x < \theta. $$
Taking the logarithm:
$$ \log f(x;\theta) = -\log\theta. $$
Computing the derivative:
$$ \frac{\partial}{\partial\theta}\log f(x;\theta) = -\frac{1}{\theta}. $$
Thus, the Fisher information is:
$$ I(\theta) = E\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}\right] = \frac{1}{\theta^2}. $$
Let $T$ be an unbiased estimator of $\theta$. The CRLB is given by:
$$ \mathrm{CRLB}(\theta) = \frac{1}{n I(\theta)} = \frac{\theta^2}{n}. $$
We now consider the unbiased estimator $2\bar{X}$, which satisfies:
$$ E[2\bar{X}] = 2E[\bar{X}] = 2 \cdot \frac{\theta}{2} = \theta. $$
Its variance is:
$$ \operatorname{Var}(2\bar{X}) = \frac{4}{n}\operatorname{Var}(X_1) = \frac{4}{n} \cdot \frac{\theta^2}{12} = \frac{\theta^2}{3n} < \frac{\theta^2}{n} = \mathrm{CRLB}(\theta). $$
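This can be verified numerically. The sketch below (the estimator $2\bar{X}$ and the values $\theta = 3$, $n = 10$ are illustrative choices) estimates the variance of $2\bar{X}$ by Monte Carlo and compares it to $\theta^2/n$:

```python
import random

# Monte Carlo check: Var(2 * Xbar) for a sample of size n from U(0, theta)
# should come out near theta^2 / (3n), which is below theta^2 / n.
theta, n, reps = 3.0, 10, 200_000
random.seed(0)

est = [2 * sum(random.uniform(0, theta) for _ in range(n)) / n
       for _ in range(reps)]
mean = sum(est) / reps
var = sum((e - mean) ** 2 for e in est) / reps

crlb = theta ** 2 / n          # the "bound" from the formal CRLB computation
print(mean)                    # close to theta = 3 (unbiasedness)
print(var < crlb)              # the variance falls below the "bound"
```

With these values the simulated variance lands near $\theta^2/(3n) = 0.3$, well under $\theta^2/n = 0.9$.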
This contradicts the claim that $\mathrm{CRLB}(\theta)$ is a lower bound on the variance of every unbiased estimator of $\theta$.
We also consider:
$$ \frac{n+1}{n}\,X_{(n)}, $$
where $X_{(n)} = \max(X_1, X_2, \dots, X_n)$ is the complete sufficient statistic for $\theta$.
Its expected value is:
$$ E\!\left[\frac{n+1}{n}X_{(n)}\right] = \frac{n+1}{n}\,E[X_{(n)}] = \frac{n+1}{n} \cdot \frac{n\theta}{n+1} = \theta, $$
and its variance is:
$$ \operatorname{Var}\!\left(\frac{n+1}{n}X_{(n)}\right) = \frac{(n+1)^2}{n^2} \cdot \frac{n\theta^2}{(n+1)^2(n+2)} = \frac{\theta^2}{n(n+2)} < \frac{\theta^2}{n}. $$
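A Monte Carlo check of this estimator as well (again with the illustrative values $\theta = 3$, $n = 10$); its variance should land near $\theta^2/(n(n+2))$, far below $\theta^2/n$:

```python
import random

# Monte Carlo check: Var((n+1)/n * max(X_1, ..., X_n)) for U(0, theta)
# should come out near theta^2 / (n * (n + 2)).
theta, n, reps = 3.0, 10, 200_000
random.seed(0)

est = [(n + 1) / n * max(random.uniform(0, theta) for _ in range(n))
       for _ in range(reps)]
mean = sum(est) / reps
var = sum((e - mean) ** 2 for e in est) / reps

crlb = theta ** 2 / n                     # the formal "bound"
exact = theta ** 2 / (n * (n + 2))        # theoretical variance
print(mean)                               # close to theta = 3 (unbiasedness)
print(var < crlb)                         # again below the "bound"
```

Note that $\theta^2/(n(n+2))$ is smaller than $\theta^2/(3n)$ for every $n > 1$, so the scaled maximum improves on $2\bar{X}$, as expected from its construction from the complete sufficient statistic.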
Since we have exhibited unbiased estimators with variance smaller than $\theta^2/n$, the Cramér-Rao inequality fails for $U(0, \theta)$; this is consistent with the violation of the regularity conditions noted above, not a contradiction of the theorem itself.