
Power Series Distribution and Its Applications

Power Series Distribution and its applications in finding the UMVUE (uniformly minimum variance unbiased estimator), sufficient statistics, and conditional expectations.

Power Series Distribution


Power series distributions are a family of discrete distributions that includes several well-known distributions as special cases, such as the Poisson, geometric, and negative binomial distributions.

Suppose $a = (a_0, a_1, a_2, \dots)$ is a sequence of non-negative real numbers. The partial sum up to degree $n$ is

`f_n(\theta)=\sum_{x=0}^{n} a_{x}\theta^{x}`

The power series is defined as $\lim_{n\to \infty} f_{n}(\theta)$ and is denoted by

`f(\theta) =\sum_{x=0}^{\infty} a_{x}\theta^{x}`

This power series is centered at $0$ and converges for $|\theta| < R$, where the radius of convergence is $R = \left( \lim_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right| \right)^{-1}$. If the radius of convergence is $\infty$, the series converges for all real values of $\theta$.
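As a quick numerical sketch of the ratio test (the helper name and index choice here are illustrative, not from the original text), $R$ can be approximated by evaluating $|a_{n+1}/a_n|$ at a large index:

```python
from fractions import Fraction
from math import factorial

# Approximate R = 1 / lim |a_{n+1}/a_n| via the ratio test at a large index n.
# `coeff` is the coefficient function a(x); exact rationals avoid float overflow.
def radius_of_convergence(coeff, n=200):
    ratio = Fraction(coeff(n + 1)) / Fraction(coeff(n))
    return float('inf') if ratio == 0 else float(1 / ratio)

# a_x = 1 gives the geometric series, with R = 1.
print(radius_of_convergence(lambda x: 1))                          # 1.0
# a_x = 1/x! gives the exponential series; the estimate grows with n (R = infinity).
print(radius_of_convergence(lambda x: Fraction(1, factorial(x))))  # 201.0
```

For the exponential-series coefficients the finite-`n` estimate keeps growing as `n` increases, reflecting the infinite radius of convergence.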

Its Distribution 

If we restrict $\theta \in [0, R)$, this will be our parameter space.

then its distribution is given by

(Power series distribution) [Roy and Mitra (1957)] Let $X_1, X_2, \dots, X_n$ be a random sample from a power series distribution with pmf
 `P_\theta\{X = x\} = \dfrac{a(x)\theta^x}{c(\theta)} ; \quad x = 0, 1, 2, \dots`
 where $\theta > 0$, $a(x) > 0$, and $c(\theta)$ is given by `c(\theta) = \sum_{x=0}^{\infty} a(x)\theta^x`. Show that $T = \sum_{i=1}^{n} X_i$ is a complete sufficient statistic for `\theta` and that the UMVUE of $\theta^r$, where $r > 0$ is an integer, is

`\delta(t) = \begin{cases} 0, & \text{if } t < r \\ \frac{A(t-r,n)}{A(t,n)}, & \text{if } t \ge r \end{cases}`

where $A(t,n)$ is the coefficient of $\theta^{t}$ in the expansion of $[c(\theta)]^n$, i.e.,

`A(t,n)=\sum_{\{(x_{1}, x_{2}, \dots, x_{n})\, :\, \sum_{i=1}^{n} x_{i} = t\}}\left(\prod_{i=1}^{n} a(x_{i})\right)`

Also find the UMVUE of $P_{\theta}(X=x)$.

The distribution of $T=\sum_{i=1}^{n}X_{i}$ is given by

`P_{\theta}(T=t)=\sum_{\{x\, :\, T(x)=t\}} \frac{\prod_{i=1}^{n} a(x_i)}{[c(\theta)]^n} \theta^t`

Here $\sum_{\{x\, :\, T(x)=t\}} \prod_{i=1}^{n} a(x_i)$ is the coefficient of $\theta^t$ in the expansion of $[c(\theta)]^n = \left[\sum_{x=0}^{\infty} a(x)\theta^{x}\right]^{n}$. We denote this by $A(t,n)$. Thus the distribution of $T$ is given by

`P_\theta(T=t) = \frac{A(t,n)\theta^t}{[c(\theta)]^n}, \quad t=0,1,2,\dots`

The distribution of $T$ belongs to the one-parameter exponential family, so $T$ is a complete sufficient statistic. The UMVUE $\delta(T)$ of $\theta^r$ must satisfy

`E_\theta[\delta(T)]=\theta^r`

`\sum_{t=0}^{\infty}\delta(t)\frac{A(t,n)\theta^t}{[c(\theta)]^n} = \theta^r`

`\sum_{t=0}^{\infty}\delta(t)A(t,n)\theta^t = [c(\theta)]^{n}\theta^r`

Expanding the right-hand side,

`[c(\theta)]^{n}\theta^r = \sum_{t=0}^{\infty} A(t,n) \theta^{t+r} = \sum_{y=r}^{\infty} A(y-r,n)\theta^y`

`=\sum_{y=0}^{r-1} 0\cdot\theta^{y} + \sum_{y=r}^{\infty} A(y-r,n) \theta^{y}`

Comparing the coefficients of $\theta^t$ on both sides, we have

`\boxed{\delta(t) = \begin{cases} 0, & \text{if } t=0,1,\dots,r-1 \\ \frac{A(t-r,n)}{A(t,n)}, & \text{if } t \ge r \end{cases}}`

This $\delta(T)$ is unbiased for $\theta^r$, and being a function of the complete sufficient statistic $T$, it is the UMVUE.
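To make the boxed formula concrete, here is a small numerical sketch (the choice of Poisson-type coefficients $a(x)=1/x!$ and the parameter values are purely illustrative): $A(t,n)$ is computed by convolving the coefficient sequence with itself $n$ times, and the resulting $\delta(t)$ is checked to be unbiased for $\theta^r$.

```python
import numpy as np
from math import factorial, exp

# A(t, n): coefficient of theta^t in [c(theta)]^n, computed by
# convolving the sequence a(0), a(1), ... with itself n times.
def A_coeffs(a, n):
    out = np.zeros(len(a)); out[0] = 1.0        # [c(theta)]^0 = 1
    for _ in range(n):
        out = np.convolve(out, a)[:len(a)]      # truncate to degree len(a)-1
    return out

# Illustrative choice: Poisson coefficients a(x) = 1/x!, so c(theta) = e^theta.
T_max, n, r = 40, 5, 2
a = np.array([1.0 / factorial(x) for x in range(T_max + 1)])
A = A_coeffs(a, n)

# UMVUE of theta^r: delta(t) = A(t-r, n) / A(t, n) for t >= r, else 0.
delta = np.array([A[t - r] / A[t] if t >= r else 0.0 for t in range(T_max + 1)])

# Unbiasedness check at theta = 0.7: E[delta(T)] should equal theta^r.
theta = 0.7
pmf = A * theta ** np.arange(T_max + 1) / exp(n * theta)   # P(T = t), truncated
print(np.isclose(delta @ pmf, theta ** r))                 # True
```

The truncation at `T_max` terms makes the expectation exact only up to a negligible tail, which is why `np.isclose` is used rather than an equality test.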

Now consider the problem of estimating the pmf $P_\theta(X=x)$ on the basis of the sample observations.

The estimator 

`U(X_1) = \begin{cases} 1, & \text{if } X_1 = x \\ 0, & \text{otherwise} \end{cases}`

is an unbiased estimator of $P_{\theta}(X=x)$, since $E[U(X_{1})]=P_{\theta}(X_1=x)=\frac{a(x)\theta^x}{c(\theta)}.$

Since $T=\sum_{i=1}^{n} X_i$ is a complete sufficient statistic for $\theta$, the UMVUE of $P_{\theta}(X=x)$ is

`E(U(X)|T=t)=\frac{P\left(X_1 =x, \sum_{i=1}^{n} X_i =t \right)}{P\left(\sum_{i=1}^{n} X_i =t \right)}`

`E(U(X)|T=t)=\frac{P(X_1 =x)P\left(\sum_{i=2}^{n} X_i =t-x \right)}{P\left(\sum_{i=1}^{n} X_i =t \right)}`

`E(U(X)|T=t)=\frac{ \frac{a(x)\theta^{x}}{c(\theta)} P\left(\sum_{i=2}^{n} X_i =t-x \right)}{P\left(\sum_{i=1}^{n} X_i =t \right)}`

`= \frac{\frac{a(x)\theta^{x}}{c(\theta)}\cdot\frac{A(t-x,n-1)}{[c(\theta)]^{n-1}}\theta^{t-x}}{\frac{A(t,n)}{[c(\theta)]^{n}}\theta^t}`

`\boxed{\delta(x,t)=a(x)\frac{A(t-x,n-1)}{A(t,n)}, \quad n>1,\; 0\le x \le t}`

This is the UMVUE of $P_\theta(X=x)$.
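A useful sanity check on this estimator: for fixed $t$ it sums to $1$ over $x$, because $A(t,n)=\sum_x a(x)A(t-x,n-1)$. A minimal sketch, assuming geometric-series coefficients $a(x)=1$ (chosen only for illustration):

```python
import numpy as np

# A(t, n) via repeated convolution of the coefficient sequence with itself.
def A_coeffs(a, n):
    out = np.zeros(len(a)); out[0] = 1.0
    for _ in range(n):
        out = np.convolve(out, a)[:len(a)]
    return out

# Illustrative choice: a(x) = 1, i.e. c(theta) = 1/(1 - theta) (geometric case).
T_max, n = 25, 4
a = np.ones(T_max + 1)
A_n, A_n1 = A_coeffs(a, n), A_coeffs(a, n - 1)

# UMVUE of the pmf: a(x) A(t-x, n-1) / A(t, n); it is itself a pmf in x.
t = 10
delta = np.array([a[x] * A_n1[t - x] / A_n[t] for x in range(t + 1)])
print(np.isclose(delta.sum(), 1.0))   # True
```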

Applications of Power Series Distribution in finding the UMVUE 

Just as sufficient statistics can be read off from the exponential family form, for any distribution belonging to the power series family we can easily find the UMVUE of $\theta^{r}$.

For example, consider the Poisson distribution with random variable $X\sim P(\theta)$:

`P(X=x)= \frac{e^{-\theta}\theta^{x}}{x!} ; \quad x=0,1,2,\dots`

on comparing with power series distribution 

`P_\theta\{X = x\} = \dfrac{a(x)\theta^x}{c(\theta)} ; \quad x = 0, 1, 2, \dots`

we get $a(x) = \frac{1}{x!}$ and $c(\theta) = e^{\theta}$.

Now we calculate the coefficient of $\theta^{t}$ in the expansion of $[c(\theta)]^{n}$:

`[c(\theta)]^{n} = e^{n\theta} = \sum_{t=0}^{\infty} \frac{(n\theta)^t}{t!}`

So the coefficient $A(t,n)$ is $\frac{n^t}{t!}$. The UMVUE of $\theta^r$ in $P(\theta)$ is therefore given by

`\delta(t) = \frac{A(t-r,n)}{A(t,n)}`

`\delta(t) = \frac{n^{t-r}}{(t-r)!}\times\frac{t!}{n^t}`

In simplified form the final UMVUE is

`\delta(t) = \frac{t!}{(t-r)!}\times\frac{1}{n^r}, \quad t \ge r`
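A quick numerical check of this formula (the parameter values $n=5$, $r=2$, $\theta=0.6$ are chosen arbitrarily): with $T \sim P(n\theta)$, the expectation of $\delta(T)$ recovers $\theta^r$.

```python
from math import exp

# Check E[delta(T)] = theta^r for T ~ Poisson(n*theta), with
# delta(t) = t!/((t-r)! n^r) = t(t-1)...(t-r+1) / n^r.
n, r, theta = 5, 2, 0.6
mu = n * theta
pmf = exp(-mu)                     # P(T = 0)
expectation = 0.0
for t in range(200):               # truncate the infinite sum
    if t >= r:
        falling = 1.0
        for j in range(r):
            falling *= (t - j)     # falling factorial t(t-1)...(t-r+1)
        expectation += falling / n ** r * pmf
    pmf *= mu / (t + 1)            # P(T = t+1) = P(T = t) * mu / (t + 1)
print(abs(expectation - theta ** r) < 1e-12)   # True
```

The Poisson pmf is updated by the recurrence $P(T=t+1)=P(T=t)\,\mu/(t+1)$ to avoid large factorials.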

Consider another application of the power series distribution: finding a UMVUE for the negative binomial distribution.

Let $X_{1},X_{2},\dots,X_{n}$ be independent and identically distributed from a negative binomial $NB(m,\theta)$ distribution. We need to find the UMVUE of

`P_\theta(X=x)=\binom{m+x-1}{m-1}\theta^m (1-\theta)^x, \quad x=0,1,2,3,\dots`

This negative binomial distribution is a power series distribution in $(1-\theta)$ with $a(x) = \binom{m+x-1}{m-1}$.

Here $T=\sum_{i=1}^{n} X_{i}$ is sufficient and complete for $\theta$, and $T \sim NB(nm, \theta)$ with pmf

`P_\theta(T=t) = \binom{nm+t-1}{nm-1}\theta^{nm}(1-\theta)^t, \quad t=0,1,2,3,\dots`

Here $A(t, nm)=\binom{nm+t-1}{nm-1}$. Therefore, by the power series distribution result, the UMVUE of $P_\theta(X=x)$ is given by

`\delta(t) = \frac{a(x)A[t-x,(n-1)m]}{A(t,nm)}`

`\boxed{\delta(x,t)=\frac{\binom{m+x-1}{m-1}\binom{(n-1)m+t-x-1}{(n-1)m-1}}{\binom{nm+t-1}{nm-1}}}`

Another method of finding the UMVUE of $P_\theta(X=x)$: consider the unbiased estimator

`U(X) = \begin{cases} 1, & \text{if } X_{1} = x \\ 0, & \text{otherwise} \end{cases}`

Here $U(X)$ is unbiased, with

$E[U(X)] = \binom{m+x-1}{m-1}\theta^m (1-\theta)^x$. Also, $\sum_{i=1}^{n} X_i$ is sufficient and complete, so by the $\textbf{Lehmann–Scheffé theorem}$ the UMVUE is given by

`E\left[ U(X)\mid T=t \right] = P\left[ X_1 =x \,\Big|\, \sum_{i=1}^n X_i =t\right] = \frac{P\left[ X_1 =x\right] P\left[ \sum_{i=2}^n X_i =t-x\right]}{P\left[\sum_{i=1}^{n}X_i=t\right]}`


`= \frac{\binom{m+x-1}{m-1}\binom{(n-1)m+t-x-1}{(n-1)m-1}}{\binom{nm+t-1}{nm-1}}`
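Both derivations give the same answer. The Vandermonde-type identity underneath it, $\sum_x \binom{m+x-1}{m-1}\binom{(n-1)m+t-x-1}{(n-1)m-1} = \binom{nm+t-1}{nm-1}$, can be checked numerically (the values of $m$, $n$, $t$ below are chosen arbitrarily):

```python
from math import comb

# The NB UMVUE sums to 1 over x because of the Vandermonde-type identity
# sum_x C(m+x-1, m-1) C((n-1)m+t-x-1, (n-1)m-1) = C(nm+t-1, nm-1).
m, n, t = 3, 4, 7
lhs = sum(comb(m + x - 1, m - 1) * comb((n - 1) * m + t - x - 1, (n - 1) * m - 1)
          for x in range(t + 1))
rhs = comb(n * m + t - 1, n * m - 1)
print(lhs == rhs)   # True
```

The identity is just the convolution of two negative binomial generating functions, $(1-u)^{-m}(1-u)^{-(n-1)m} = (1-u)^{-nm}$, read off at the coefficient of $u^t$.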
