The probability mass function of the Poisson distribution is given by
`P(X = x) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \, \lambda > 0.`
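As a quick numerical illustration, the PMF can be evaluated directly from this formula. The short sketch below is only an assumption-laden example (it assumes NumPy and SciPy are available, and the rate lam = 3.0 is an arbitrary choice); it compares the hand-coded formula against scipy.stats.poisson.pmf:

import numpy as np
from scipy.special import factorial
from scipy.stats import poisson

lam = 3.0                                  # an arbitrary example rate
x = np.arange(0, 10)                       # first few support points 0, 1, ..., 9

pmf_formula = np.exp(-lam) * lam**x / factorial(x)   # direct use of the PMF formula
pmf_library = poisson.pmf(x, lam)                    # library evaluation for comparison

print(np.allclose(pmf_formula, pmf_library))         # True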

To find the maximum likelihood estimator of $\lambda$, we start from the likelihood function of a random sample $x_1, x_2, \dots, x_n$:
`L(\lambda) = \prod_{i=1}^n P(X_i = x_i; \lambda) = \prod_{i=1}^n \left( \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} \right).`
This can be rewritten as:
`L(\lambda) = \frac{\lambda^{\sum_{i=1}^n x_i} e^{-n\lambda}}{\prod_{i=1}^n x_i!}.`
Now, taking the logarithm of both sides to simplify:
`\log L(\lambda) = \log \left( \frac{\lambda^{\sum_{i=1}^n x_i} e^{-n\lambda}}{\prod_{i=1}^n x_i!} \right).`
Expanding the logarithm term by term gives:
`\log L(\lambda) = \left( \sum_{i=1}^n x_i \right) \log \lambda - n\lambda - \sum_{i=1}^n \log x_i!.`
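As a sanity check on this expansion, the following sketch (assuming SciPy, where gammaln(x + 1) gives $\log x!$; the sample and trial value of $\lambda$ are made up for illustration) evaluates the expanded log-likelihood and compares it with the log of the product of PMF terms:

import numpy as np
from scipy.special import gammaln
from scipy.stats import poisson

sample = np.array([1, 3, 0, 2, 4])          # a tiny illustrative sample
lam = 2.5                                   # an arbitrary trial value of lambda

# Expanded form: sum(x_i) log(lam) - n*lam - sum(log x_i!), using gammaln(x + 1) = log(x!)
loglik_expanded = sample.sum() * np.log(lam) - sample.size * lam - gammaln(sample + 1).sum()

# Brute-force form: log of the product of the individual PMF terms
loglik_direct = np.log(np.prod(poisson.pmf(sample, lam)))

print(np.isclose(loglik_expanded, loglik_direct))   # True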
Next, differentiate the above log-likelihood function with respect to the parameter $\lambda$:
`\frac{\partial \log L(\lambda)}{\partial \lambda} = \frac{\sum_{i=1}^n x_i}{\lambda} - n.`
Setting this derivative equal to zero:
`\frac{\sum_{i=1}^n x_i}{\lambda} - n = 0.`
Thus, the maximum likelihood estimator (MLE) for $\lambda$ is:
`\hat{\lambda}_{MLE} = \frac{\sum_{i=1}^n x_i}{n} = \bar{X}.`
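This closed-form result can also be verified numerically. The sketch below is a rough illustration (it assumes SciPy's minimize_scalar; the simulated sample, true rate, and search bounds are arbitrary choices): it maximizes the log-likelihood by minimizing its negative and compares the maximizer with the sample mean.

import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
sample = rng.poisson(lam=4.2, size=500)     # simulated data with a known rate

# Negative log-likelihood: -(sum(x) log(lam) - n*lam - sum(log x!))
def neg_log_lik(lam):
    return -(sample.sum() * np.log(lam) - sample.size * lam - gammaln(sample + 1).sum())

result = minimize_scalar(neg_log_lik, bounds=(1e-6, 50.0), method='bounded')
print(result.x)        # numerical maximizer of the likelihood
print(sample.mean())   # closed-form MLE; the two should agree closely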
As a worked example, consider a sample drawn from a Poisson distribution; we restate the derivation and then apply it to concrete data.
Let $X_1, X_2, \dots, X_n$ be independent random variables from a Poisson distribution with parameter $\lambda$. The probability mass function (PMF) of the Poisson distribution is given by:
`P(X = x) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \dots`
The likelihood function $L(\lambda)$ for a sample of size $n$ is the product of these probabilities:
`L(\lambda | X_1, X_2, \dots, X_n) = \prod_{i=1}^n \frac{\lambda^{X_i} e^{-\lambda}}{X_i!}.`
Taking the logarithm of the likelihood function, we get the log-likelihood:
`\log L(\lambda) = \sum_{i=1}^n \left( X_i \log \lambda - \lambda - \log X_i! \right).`
To find the MLE, we differentiate the log-likelihood with respect to $\lambda$:
`\frac{d}{d\lambda} \log L(\lambda) = \sum_{i=1}^n \left( \frac{X_i}{\lambda} - 1 \right).`
Setting this derivative to zero to find the maximum:
`\sum_{i=1}^n \left( \frac{X_i}{\lambda} - 1 \right) = 0.`
Simplifying gives:
`\sum_{i=1}^n \frac{X_i}{\lambda} = n.`
Solving for $\lambda$, the Maximum Likelihood Estimator (MLE) is:
`\hat{\lambda} = \frac{1}{n} \sum_{i=1}^n X_i.`
Given a dataset $\{ X_i \} = \{87, 67, 98, 97, 67, 98, 67, 89, 67\}$ with sample size $n = 9$, the sum of the $X_i$ is:
`\sum X_i = 87 + 67 + 98 + 97 + 67 + 98 + 67 + 89 + 67 = 737.`
Thus, the MLE for $\lambda$ is:
`\hat{\lambda} = \frac{737}{9} \approx 81.89.`
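The same arithmetic can be checked in a couple of lines of Python (a minimal sketch; the data array below is simply the sample listed above):

import numpy as np

data = np.array([87, 67, 98, 97, 67, 98, 67, 89, 67])   # the sample above
print(data.sum(), data.size)        # 737, 9
print(data.mean())                  # MLE of lambda: 81.888...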
Python code for the Poisson distribution curve
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import factorial

def poisson_distribution(lam):
    # Support points 0, 1, ..., 19 and their Poisson probabilities
    x = np.arange(0, 20, 1)
    probabilities = np.exp(-lam) * (lam**x) / factorial(x)
    return x, probabilities

lam = float(input("Enter the mean (lambda): "))
x, probabilities = poisson_distribution(lam)

# Bar chart of the PMF
plt.bar(x, probabilities, color='orange', alpha=0.7)
plt.title(f"Poisson Distribution (lambda={lam})")
plt.xlabel('X')
plt.ylabel('Probability')
plt.grid(alpha=0.3)
plt.show()
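As a side note on the implementation: scipy.stats.poisson.pmf(x, lam) would compute the same probabilities without an explicit factorial, which tends to be more numerically stable when x is large; the version above simply stays closer to the formula derived earlier.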