
Maximum likelihood estimation for Normal Distribution


The pdf of the normal distribution is given by:

`f(x) = \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{(x - \mu)^2}{2\sigma^2}}`

Case 1: MLE for `\mu` when `\sigma^2` is known.



The likelihood function is given by

`f(x; \mu, \sigma^2) =\prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_{i} - \mu)^2}{2\sigma^2}\right)`

`L(\mu, \sigma^2) = \left(\frac{1}{\sqrt{2\pi \sigma^2}}\right)^n \exp\left(-\frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2\right)`

Taking the logarithm of both sides,

`\log L(\mu, \sigma^2) = n \log \left(\frac{1}{\sqrt{2\pi \sigma^2}}\right) - \frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2`

`\log L(\mu, \sigma^2) = -\frac{n}{2} \log(2\pi\sigma^2)  - \frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2`
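As a quick sanity check of the closed-form log-likelihood above, we can compare it against summing `scipy`'s `logpdf` term by term (a sketch with illustrative parameter values; the two should agree to floating-point precision):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
mu, sigma2 = 1.0, 4.0
x = rng.normal(mu, np.sqrt(sigma2), size=100)
n = len(x)

# Closed-form log-likelihood from the derivation above
closed_form = -n / 2 * np.log(2 * np.pi * sigma2) - np.sum((x - mu) ** 2) / (2 * sigma2)

# Term-by-term sum of log-densities
term_by_term = np.sum(norm.logpdf(x, loc=mu, scale=np.sqrt(sigma2)))

print(closed_form, term_by_term)
```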

Now differentiate the log-likelihood partially with respect to `\mu`:

`\frac{\partial}{\partial \mu} \log L(\mu, \sigma^2) = \frac{\partial}{\partial \mu} \left(-\frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2\right)`

`\frac{\partial}{\partial \mu} \log L(\mu, \sigma^2) = -\frac{1}{2\sigma^2} \cdot \sum_{i=1}^n \frac{\partial}{\partial \mu} (x_i - \mu)^2`

`\frac{\partial}{\partial \mu} (x_i - \mu)^2 = -2(x_i - \mu)`

`\frac{\partial}{\partial \mu} \log L(\mu, \sigma^2) = -\frac{1}{2\sigma^2} \cdot \sum_{i=1}^n (-2)(x_i - \mu)`

`\frac{\partial}{\partial \mu} \log L(\mu, \sigma^2) = \frac{1}{\sigma^2} \sum_{i=1}^n (x_i - \mu)`

`\frac{\partial}{\partial \mu} \log L(\mu, \sigma^2) = \frac{1}{\sigma^2} \left(\sum_{i=1}^n x_i - n\mu\right)`

Setting the derivative equal to zero to obtain the MLE of `\mu`:

`\frac{1}{\sigma^2} \left(\sum_{i=1}^n x_i - n\mu\right) = 0`

`\sum_{i=1}^n x_i - n\mu = 0`

`n\mu = \sum_{i=1}^n x_i`

`\hat{\mu}_{MLE} = \frac{1}{n} \sum_{i=1}^n x_i = \bar{X}`

This is the required maximum likelihood estimator of `\mu` for `X \sim N(\mu, \sigma^2)`.
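The closed-form result can be verified numerically: minimizing the negative log-likelihood over `\mu` (with `\sigma` held fixed) should land on the sample mean. This is an illustrative sketch with made-up parameters:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
sigma = 2.0                      # known standard deviation
x = rng.normal(loc=5.0, scale=sigma, size=1000)

# Negative log-likelihood as a function of mu (sigma held fixed)
def neg_log_lik(mu):
    return -np.sum(norm.logpdf(x, loc=mu, scale=sigma))

res = minimize_scalar(neg_log_lik)
print(f"numerical MLE: {res.x:.4f}, sample mean: {x.mean():.4f}")
```

The numerical minimizer and `x.mean()` agree, confirming the derivation.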

The restricted MLE of `\mu` under the constraint `\mu \ge \mu_0` is `\hat{\mu} = \max(\bar{X}, \mu_0)`: if the sample mean already satisfies the constraint it is kept, otherwise the estimate sits on the boundary `\mu_0`.

[Figure: normal distribution curves with different shaded regions illustrating the restricted case.]
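A minimal numerical sketch of the restricted case (the value of `\mu_0` here is purely illustrative): the data are generated so that the sample mean falls below the constraint boundary, forcing the restricted estimate onto the boundary.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=500)
mu0 = 0.5                          # constraint boundary (illustrative)

x_bar = x.mean()
# Restricted MLE under mu >= mu0: keep x_bar if feasible, else use mu0
mu_hat_restricted = max(x_bar, mu0)
print(f"sample mean = {x_bar:.4f}, restricted MLE = {mu_hat_restricted:.4f}")
```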

Python code for plotting the normal distribution:

# Import necessary libraries
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

# Parameters of the normal distribution
mu = 0  # Mean
sigma = 1  # Standard deviation

# Generate x values
x = np.linspace(-6, 6, 1000)

# Normal PDF values
y = norm.pdf(x, mu, sigma)

# Calculate the probability for -5 < X < 5
lower_bound, upper_bound = -5, 5
probability = norm.cdf(upper_bound, mu, sigma) - norm.cdf(lower_bound, mu, sigma)
print(f"P({lower_bound} < X < {upper_bound}) = {probability:.4f}")

# Highlight the area under the curve for -5 < X < 5
x_fill = np.linspace(lower_bound, upper_bound, 1000)
y_fill = norm.pdf(x_fill, mu, sigma)

# Plot the normal distribution curve
plt.plot(x, y, label='Normal Distribution', color='blue')
plt.fill_between(x_fill, 0, y_fill, color='orange', alpha=0.5, label=f"Area = {probability:.4f}")

# Add labels and title
plt.title('Normal Distribution Curve with P(-5 < X < 5)')
plt.xlabel('X')
plt.ylabel('Density')
plt.legend()
plt.grid(True)

# Show the plot
plt.show()


Case 2: MLE for `\sigma^2` when `\mu` is known.

`f(x; \mu, \sigma^2) =\prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_{i} - \mu)^2}{2\sigma^2}\right)`

`L(\mu, \sigma^2) = \left(\frac{1}{\sqrt{2\pi \sigma^2}}\right)^n \exp\left(-\frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2\right)`

Taking the logarithm of both sides,

`\log L(\mu, \sigma^2) = n \log \left(\frac{1}{\sqrt{2\pi \sigma^2}}\right) - \frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2`

`\log L(\mu, \sigma^2) = -\frac{n}{2} \log(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^n (x_i - \mu)^2`

Now differentiate the log-likelihood partially with respect to `\sigma^2`:

`\frac{\partial}{\partial \sigma^2} \log L(\mu, \sigma^2) = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} \sum_{i=1}^n (x_i - \mu)^2`

Setting the derivative equal to zero to obtain the MLE of `\sigma^2`:

`-\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} \sum_{i=1}^n (x_i - \mu)^2 = 0`

`-n\sigma^2 + \sum_{i=1}^n (x_i - \mu)^2 = 0`

`n\sigma^2 = \sum_{i=1}^n (x_i - \mu)^2`

`\sigma^2 = \frac{1}{n} \sum_{i=1}^n (x_i - \mu)^2`

When `\mu` is known, the maximum likelihood estimator of `\sigma^2` is

`\hat{\sigma}_{MLE}^2 = \frac{1}{n} \sum_{i=1}^n (x_i - \mu)^2`

When `\mu` is unknown, the maximum likelihood estimator of `\sigma^2` is

`\hat{\sigma}_{MLE}^2 = \frac{1}{n} \sum_{i=1}^n (x_i - \bar{X})^2`
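Both variance estimators are easy to compute directly (a sketch with illustrative parameters). Note that the `\mu`-unknown version is exactly what `np.var` returns with its default `ddof=0`:

```python
import numpy as np

rng = np.random.default_rng(2)
mu_true, sigma_true = 3.0, 1.5
x = rng.normal(mu_true, sigma_true, size=2000)

# MLE of sigma^2 with mu known: average squared deviation from mu
var_known_mu = np.mean((x - mu_true) ** 2)

# MLE of sigma^2 with mu unknown: substitute the sample mean
var_unknown_mu = np.mean((x - x.mean()) ** 2)  # same as np.var(x)

print(f"MLE (mu known):   {var_known_mu:.4f}")
print(f"MLE (mu unknown): {var_unknown_mu:.4f}")
```

Because deviations are measured from the sample mean, the `\mu`-unknown estimate is never larger than the `\mu`-known one.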


