Consider a sequence of independent $Bernoulli$ trials, where each trial has two outcomes, called "success" or "failure". In each trial the probability of success is $p$ and the probability of failure is $1-p$. If we observe the trials until a predefined number $r$ of successes has occurred, then the random number of observed failures, $X$, follows the $Negative\, Binomial$ (Pascal) distribution.
`X \sim NBD(r, p)`
Let $X_{1},X_{2},X_{3},\ldots,X_{n}$ be independent and identically distributed observations $\sim NBD(r,p)$, with probability mass function
`P(X \,=\,x) = p(x) = \begin{cases} \binom{x+r-1}{x}p^{r}(1-p)^{x} &; x = 0,1,2,\ldots \\ 0 &; \text{otherwise} \end{cases}`
`P(X \,=\,x)=\binom{x+r-1}{x}p^{r}(1-p)^{x} \, ; \; x = 0,1,2,\ldots \tag{1}`
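As a quick sanity check of $(1)$, the sketch below compares the formula with `scipy.stats.nbinom`, assuming SciPy's parameterization ($X$ = number of failures before the $r$-th success, success probability $p$); the values of $r$ and $p$ are illustrative, not taken from the text.

```python
# A numerical check of the PMF in (1) against scipy.stats.nbinom,
# assuming SciPy counts failures before the r-th success.
from math import comb

from scipy.stats import nbinom

r, p = 5, 0.3                                          # illustrative values
for x in range(6):
    manual = comb(x + r - 1, x) * p**r * (1 - p)**x    # formula (1)
    library = nbinom.pmf(x, r, p)                      # SciPy's PMF
    assert abs(manual - library) < 1e-12
```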
The likelihood function of the sample is
`L(r \, , p \mid \underline{x}) = \prod_{i=1}^{n} p(x_{i} \mid r \, , p) \tag{2}`
`L = \prod_{i=1}^{n} \left[ \binom{x_{i}+r-1}{x_{i}}p^{r}(1-p)^{x_{i}} \right]`
`L = \prod_{i=1}^{n} \binom{x_{i}+r-1}{x_{i}} \prod_{i=1}^{n}(1-p)^{x_{i}} \prod_{i=1}^{n} p^{r}`
`L = \prod_{i=1}^{n} \binom{x_{i}+r-1}{x_{i}} \times (1-p)^{\sum x_{i}} \times p^{rn}`
Now take the logarithm of both sides to simplify the calculation:
`\log L(r \, , p \mid \underline{x}) = \sum_{i=1}^{n} \log \binom{x_{i}+r-1}{x_{i}} + \sum_{i=1}^{n}x_{i} \log (1-p) + rn \log p \tag{3}`
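A minimal sketch of the log-likelihood in $(3)$, written in Python for illustration; the helper name `nbd_loglik` is hypothetical, and $\log\binom{x+r-1}{x}$ is evaluated via `gammaln` as $\log\Gamma(x+r) - \log\Gamma(r) - \log\Gamma(x+1)$.

```python
# A sketch of the log-likelihood in equation (3); nbd_loglik is a
# hypothetical helper name, not from the original text.
import numpy as np
from scipy.special import gammaln

def nbd_loglik(p, x, r):
    """Log-likelihood log L(r, p | x) for i.i.d. NBD(r, p) observations x."""
    x = np.asarray(x)
    n = x.size
    log_binom = gammaln(x + r) - gammaln(r) - gammaln(x + 1)  # log C(x+r-1, x)
    return log_binom.sum() + x.sum() * np.log(1 - p) + n * r * np.log(p)
```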
Now differentiate $(3)$ partially with respect to the parameter $p$:
`\frac{\partial}{\partial p}\left(\log L(r \, , p \mid \underline{x})\right) = \sum_{i=1}^{n} x_{i} \cdot \frac{-1}{1-p} + \frac{rn}{p} = -\frac{\sum_{i=1}^{n} x_{i}}{1-p} + \frac{rn}{p} \tag{4}`
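As an illustrative check of $(4)$, the analytic score can be compared with a central finite difference of the log-likelihood; here the log-likelihood is evaluated with `scipy.stats.nbinom.logpmf` (which sums to the same $\log L$ as $(3)$), and the sample and parameter values are made up.

```python
# Check of equation (4): the analytic score should agree with a central
# finite difference of the log-likelihood; the data below are hypothetical.
import numpy as np
from scipy.stats import nbinom

x = np.array([0, 2, 3, 1, 5, 4])                      # hypothetical sample
r, p, h = 3, 0.4, 1e-6

loglik = lambda q: nbinom.logpmf(x, r, q).sum()       # log L(r, q | x)
score = -x.sum() / (1 - p) + r * x.size / p           # equation (4)
numeric = (loglik(p + h) - loglik(p - h)) / (2 * h)   # central difference
print(score, numeric)   # the two values should agree to several decimals
```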
Setting $(4)$ equal to zero and solving for $p$:
`\frac{\partial}{\partial p}\left(\log L(r \, , p \mid \underline{x})\right) = 0 \Rightarrow \frac{\sum x_i}{1-p} = \frac{rn}{p}`
`\frac{\sum x_i}{n(1-p)} = \frac{r}{p} \Rightarrow \frac{\overline{x}}{r} = \frac{(1-p)}{p}`
`\frac{\overline{x}}{r} = \frac{1}{p} - 1 \Rightarrow \frac{\overline{x}}{r} +1 = \frac{1}{p} \tag{5}`
Finally, solving equation $(5)$ for $p$, we obtain the maximum likelihood estimator
`\hat{p}_{MLE} = \left( \frac{r}{\overline{x}_{n} + r}\right)`
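To illustrate the closed form, the following sketch simulates $NBD(r, p)$ data (assuming NumPy's `negative_binomial(r, p)` counts failures before the $r$-th success) and compares $\hat{p}_{MLE}$ with a direct numerical maximizer of the log-likelihood; the true parameter values are illustrative.

```python
# Simulation sketch: draw NBD(r, p) data, apply the closed-form MLE, and
# compare with numerical maximization of the log-likelihood.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import nbinom

rng = np.random.default_rng(0)
r, p_true = 4, 0.35                     # illustrative true parameters
x = rng.negative_binomial(r, p_true, size=10_000)

p_closed = r / (x.mean() + r)           # closed-form MLE derived above
p_numeric = minimize_scalar(
    lambda q: -nbinom.logpmf(x, r, q).sum(),    # negative log-likelihood
    bounds=(1e-6, 1 - 1e-6), method="bounded",
).x
print(p_closed, p_numeric)              # both should be close to p_true
```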