## Exercise 1 (Poisson Fisher Information)
Let $X_1, X_2, \ldots, X_n \sim \text{Poisson}(\lambda)$. Find the method of moments estimator of $\lambda$. Find the maximum likelihood estimator of $\lambda$. Find the Fisher information $I(\lambda)$.
## Exercise 2 (Fisher Information Matrix)
Let $X_1, X_2, \ldots, X_n \sim \text{N}(\mu, \sigma ^ 2)$. Find $I_n(\mu, \sigma)$.
## Exercise 3 (Exponential MLE)
Let $X_1, X_2, \ldots, X_n \sim \text{Exponential}(\lambda)$. That is,
$$
f(x) = \lambda e^{- \lambda x}, \quad x > 0.
$$
Use the MLE and its standard error to derive an expression for an approximate 95% confidence interval for $\lambda$.
## Exercise 4 (Exponential MLE, Continued)
Define $\phi = \log(\lambda)$. Use the MLE and its standard error to derive an expression for an approximate 95% confidence interval for $\phi$.
```{r}
set.seed(42)
exp_data = rexp(n = 100, rate = 0.5)
```
Using the data stored in `exp_data`, calculate an approximate 95% confidence interval for $\lambda$ two ways:
- Using the interval from Exercise 3.
- Using the interval from this exercise, transformed back to the $\lambda$ scale.
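As a numerical check, both intervals can be computed as follows. The standard-error expressions used here, $\widehat{\text{se}}(\hat{\lambda}) = \hat{\lambda} / \sqrt{n}$ and, after the delta method, $\widehat{\text{se}}(\hat{\phi}) = 1 / \sqrt{n}$, are stated as assumptions; verify them against your own derivations from Exercises 3 and 4.

```{r}
set.seed(42)
exp_data <- rexp(n = 100, rate = 0.5)  # regenerate the data so this chunk runs on its own

n <- length(exp_data)
lambda_hat <- 1 / mean(exp_data)  # MLE of lambda
z <- qnorm(0.975)

# interval from Exercise 3, directly on the lambda scale
ci_lambda <- lambda_hat + c(-1, 1) * z * lambda_hat / sqrt(n)
ci_lambda

# interval from this exercise, built on the log scale, then transformed back
phi_hat <- log(lambda_hat)
ci_lambda_via_phi <- exp(phi_hat + c(-1, 1) * z / sqrt(n))
ci_lambda_via_phi
```

The transformed interval is guaranteed to stay positive, which is one practical reason to work on the log scale.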
## Exercise 5 (Another MLE)
Let $X_1, X_2, \ldots, X_n \sim \text{N}(\theta, 1)$. Define
$$
Y_i =
\begin{cases}
1 & \text{if } X_i > 0 \\
0 & \text{if } X_i \leq 0.
\end{cases}
$$
Let $\phi = \mathbb{P}(Y_1 = 1)$.
Use the MLE, $\hat{\phi}$, and its standard error to derive an expression for an approximate 95% confidence interval for $\phi$.
## Exercise 6 (Asymptotic Relative Efficiency)
Continuing the setup from Exercise 5, now define
$$
\tilde{\phi} = \frac{1}{n} \sum_{i = 1}^{n} Y_i.
$$
Find the asymptotic relative efficiency of $\tilde{\phi}$ to $\hat{\phi}$. Your answer will be a function of $\theta$. Report the value of $\theta$ that maximizes the asymptotic relative efficiency, together with the efficiency at that value.
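Once you have the analytic answer, a short simulation can sanity-check it at a single value of $\theta$. The sketch below picks $\theta = 1$ arbitrarily and assumes, by invariance of the MLE, that $\hat{\phi} = \Phi(\bar{X})$; compare the printed ratio to your formula evaluated at the same $\theta$.

```{r}
# Monte Carlo check of the relative efficiency at one theta;
# assumes (by MLE invariance) that phi-hat = pnorm(mean(x))
set.seed(42)
theta <- 1   # arbitrary value for the check
n <- 100
reps <- 5000
phi_tilde <- replicate(reps, mean(rnorm(n, mean = theta) > 0))
phi_hat <- replicate(reps, pnorm(mean(rnorm(n, mean = theta))))
are_sim <- var(phi_hat) / var(phi_tilde)  # simulated ARE of phi-tilde to phi-hat
are_sim
```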
## Exercise 7 (Comparing Two Groups)
Suppose $n_1$ people are given treatment 1 and $n_2$ people are given treatment 2. Let $X_1$ be the number of people on treatment 1 who respond favorably to the treatment and let $X_2$ be the number of people on treatment 2 who respond favorably.
Assume $X_1 \sim \text{Binomial}(n_1, p_1)$ and $X_2 \sim \text{Binomial}(n_2, p_2)$.
Let $\phi = p_1 - p_2$.
Use the MLE, $\hat{\phi}$, and its standard error to derive an expression for an approximate 90% confidence interval for $\phi$. To arrive at the standard error, first find $I(p_1, p_2)$ and then apply the delta method.
## Exercise 8 (Comparing Standard Errors)
Continue with the setup from Exercise 7. Given:
- $n_1 = n_2 = 200$
- $X_1 = 160$
- $X_2 = 148$
Compare 90% confidence intervals for $\phi$ computed two ways: using the standard error from Exercise 7, and using a standard error obtained from the parametric bootstrap.
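The comparison can be carried out as follows. The delta-method standard error used here, $\sqrt{\hat{p}_1(1 - \hat{p}_1)/n_1 + \hat{p}_2(1 - \hat{p}_2)/n_2}$, is stated as an assumption; check it against your Exercise 7 answer.

```{r}
set.seed(42)
n1 <- 200
n2 <- 200
p1_hat <- 160 / n1
p2_hat <- 148 / n2
phi_hat <- p1_hat - p2_hat
z <- qnorm(0.95)  # 90% interval

# delta-method standard error (verify against Exercise 7)
se_delta <- sqrt(p1_hat * (1 - p1_hat) / n1 + p2_hat * (1 - p2_hat) / n2)
ci_delta <- phi_hat + c(-1, 1) * z * se_delta
ci_delta

# parametric bootstrap: resample counts from the fitted binomials
phi_boot <- replicate(10000, {
  rbinom(1, size = n1, prob = p1_hat) / n1 - rbinom(1, size = n2, prob = p2_hat) / n2
})
ci_boot <- phi_hat + c(-1, 1) * z * sd(phi_boot)
ci_boot
```

With samples this large, the two standard errors should be nearly identical, so the two intervals should agree closely.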
## Exercise 9 (Geometric MLE)
Let $X_1, X_2, \ldots, X_n \sim \text{Geometric}(\pi)$. That is,
$$
f(x) = (1 - \pi)^{x - 1} \pi, \quad x = 1, 2, 3, \ldots
$$
Use the MLE, $\hat{\pi}$, and its standard error to derive an expression for an approximate 95% confidence interval for $\pi$.
## Exercise 10 (Geometric MLE, Continued)
Define $\psi = \text{logit}(\pi)$. Use the MLE and its standard error to derive an expression for an approximate 95% confidence interval for $\psi$.
```{r}
set.seed(42)
geom_data = rgeom(n = 100, prob = 0.2)
```
Using the data stored in `geom_data`, calculate an approximate 95% confidence interval for $\pi$ two ways:
- Using the interval from Exercise 9.
- Using the interval from this exercise, transformed back to the $\pi$ scale.
```{r}
# shift data to match the parameterization used in the previous exercise
geom_data = geom_data + 1
```
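As a numerical check, both intervals can be computed as follows. The standard errors used here, $\widehat{\text{se}}(\hat{\pi}) = \sqrt{\hat{\pi}^2 (1 - \hat{\pi}) / n}$ and, after the delta method, $\widehat{\text{se}}(\hat{\psi}) = 1 / \sqrt{n (1 - \hat{\pi})}$, are stated as assumptions; verify them against your Exercise 9 and Exercise 10 derivations.

```{r}
set.seed(42)
geom_data <- rgeom(n = 100, prob = 0.2) + 1  # regenerate and shift, as above

n <- length(geom_data)
pi_hat <- 1 / mean(geom_data)  # MLE of pi under the trials parameterization
z <- qnorm(0.975)

# interval from Exercise 9, directly on the pi scale
ci_pi <- pi_hat + c(-1, 1) * z * sqrt(pi_hat^2 * (1 - pi_hat) / n)
ci_pi

# interval from this exercise, built on the logit scale, then transformed back
psi_hat <- qlogis(pi_hat)
ci_pi_via_psi <- plogis(psi_hat + c(-1, 1) * z / sqrt(n * (1 - pi_hat)))
ci_pi_via_psi
```

Note that `rgeom()` counts failures before the first success, which is why the data must be shifted by one to count trials.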
## Exercise 11 (Rao-Blackwellization)
Let $X_1, X_2, \ldots, X_n \sim \text{Poisson}(\lambda)$. Show that $\sum_{i = 1}^{n} X_i$ is a sufficient statistic for $\lambda$. Consider two estimators:
1. $\hat{\lambda}_1 = X_1$
2. $\hat{\lambda}_2$, the result of applying the Rao-Blackwell theorem to $\hat{\lambda}_1 = X_1$, conditioning on $\sum_{i = 1}^{n} X_i$.
Show that $\hat{\lambda}_2$ has a smaller MSE than $\hat{\lambda}_1$.
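A quick simulation can illustrate the MSE comparison. The sketch below assumes, as Rao-Blackwellization should give you, that $\hat{\lambda}_2 = \mathbb{E}[X_1 \mid \sum_i X_i] = \bar{X}$; the values $\lambda = 3$ and $n = 10$ are arbitrary choices.

```{r}
# simulation comparing MSEs; assumes lambda-hat-2 = X-bar (check via Rao-Blackwell)
set.seed(42)
lambda <- 3   # arbitrary true value
n <- 10
reps <- 10000
est1 <- replicate(reps, rpois(1, lambda))        # lambda-hat-1 = X_1
est2 <- replicate(reps, mean(rpois(n, lambda)))  # lambda-hat-2 = X-bar
mse1 <- mean((est1 - lambda)^2)  # roughly lambda, since est1 is unbiased
mse2 <- mean((est2 - lambda)^2)  # roughly lambda / n, since est2 is unbiased
c(mse1 = mse1, mse2 = mse2)
```

Both estimators are unbiased, so each MSE is just a variance, and the simulation should show the Rao-Blackwellized estimator winning by roughly a factor of $n$.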