Statistics Assignment Help | EEL6029 Statistical inference

Statistics-lab™ can provide homework, exam, and tutoring support for the usf.edu EEL6029 Statistical inference course!


EEL6029 Statistical inference Course Overview

This is a challenging and wide-ranging course. Key topics that might be covered include:

  1. Probability theory: This would include understanding the axioms of probability and basic rules of probability, such as conditional probability and Bayes’ theorem.
  2. Statistical inference: This would cover topics such as estimation, hypothesis testing, and confidence intervals. Students would learn about point estimation, maximum likelihood estimation, and Bayesian estimation. They would also learn about hypothesis testing, including null and alternative hypotheses, p-values, and type I and type II errors.

PREREQUISITES 

  1. Linear regression: This would involve understanding the basics of simple and multiple linear regression, including the assumptions and limitations of the models. Students would learn about least squares estimation, hypothesis testing, and confidence intervals.
  2. Bayesian statistics: This would cover the principles of Bayesian inference, including Bayes’ theorem, prior and posterior distributions, and Bayesian model selection. Students would learn about Markov Chain Monte Carlo (MCMC) methods for sampling from complex posterior distributions.
  3. Machine learning: This would include an introduction to basic machine learning techniques such as decision trees, random forests, and support vector machines. Students would learn about the trade-offs between different types of models, as well as methods for model selection and evaluation.
  4. Time series analysis: This would involve understanding the basics of time series models, including autoregressive (AR), moving average (MA), and autoregressive integrated moving average (ARIMA) models. Students would learn about forecasting and time series model selection.
  5. Cyber systems modeling: This would cover modeling of complex cyber systems using mathematical tools, such as graph theory and network analysis. Students would learn about the basics of cyber security and how to model and analyze cyber attacks using statistical methods.

Overall, this course would provide a strong foundation in mathematical statistics and its applications to complex cyber systems, data analytics, and Bayesian intelligence.

EEL6029 Statistical inference HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.

An urn contains 11 balls numbered $0,1, \ldots, 10$. A ball is selected at random. Suppose the number on the selected ball is $k$. A second urn is filled with $k$ red balls and $10-k$ blue balls. Five balls are selected at random with replacement from the second urn.
(a) Find the probability that the sample from the second urn consists of three red and two blue balls.
(b) Given that the sample from the second urn consists of three red and two blue balls, find the conditional probability that the ball selected from the first urn had the number $k=6$.

(a) Given that ball $k$ is chosen from the first urn, the probability of choosing three red and two blue balls from the second urn when sampling with replacement is the binomial probability
$$
P(\text{three red} \mid \text{ball } k)=\binom{5}{3}\left(\frac{k}{10}\right)^3\left(1-\frac{k}{10}\right)^2 .
$$
The probability of choosing ball $k$ from the first urn and three red balls from the second is therefore
$$
\begin{aligned}
P(\text{three red and ball } k) &= P(\text{three red} \mid \text{ball } k)\,P(\text{ball } k) \\
&= \binom{5}{3}\left(\frac{k}{10}\right)^3\left(1-\frac{k}{10}\right)^2 \times \frac{1}{11},
\end{aligned}
$$
and the unconditional probability of choosing three red balls from the second urn is
$$
\begin{aligned}
P(\text{three red}) &= \sum_{k=0}^{10} P(\text{three red and ball } k) \\
&= \sum_{k=0}^{10} \binom{5}{3}\left(\frac{k}{10}\right)^3\left(1-\frac{k}{10}\right)^2 \times \frac{1}{11} \\
&\approx 0.1515 .
\end{aligned}
$$
This can be computed in R as

> sum(dbinom(3, 5, (0:10)/10) / 11)
[1] 0.1515
(b) The conditional probability that the ball chosen from the first urn was numbered $k=6$, given that three red balls were chosen from the second urn, is
$$
\begin{aligned}
P(\text{ball } k=6 \mid \text{three red}) &= \frac{P(\text{three red and ball } k=6)}{P(\text{three red})} \\
&= \frac{\binom{5}{3}\left(\frac{6}{10}\right)^3\left(1-\frac{6}{10}\right)^2 \times \frac{1}{11}}{P(\text{three red})} \\
&\approx \frac{0.03142}{0.1515} \approx 0.2074 .
\end{aligned}
$$
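
The same Bayes calculation can be checked in R by computing the whole posterior distribution over $k$; the snippet below is a small sketch, and the vector names joint and posterior are illustrative rather than taken from the original solution.

> # joint probabilities P(three red and ball k) for k = 0, 1, ..., 10
> joint <- dbinom(3, 5, (0:10)/10) / 11
> # posterior P(ball k | three red), obtained by normalizing the joint
> posterior <- joint / sum(joint)
> posterior[7]   # element 7 of the vector corresponds to k = 6; approximately 0.2074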

Problem 2.
A coin has probability $p$ of coming up heads and $1-p$ of tails, with $0<p<1$. An experiment is conducted with the following steps:

  1. Flip the coin.
  2. Flip the coin a second time.
  3. If both flips land on heads or both land on tails, return to step 1.
  4. Otherwise, let the result of the experiment be the result of the last flip at step 2.

Assume flips are independent.
(a) The R function sim1 simulates this experiment, with 1 representing heads and 0 representing tails; a minimal sketch of the function is given below. Use this function to estimate the probability of heads for $p=0.2, 0.4, 0.6, 0.8$.
(b) Find the probability that the result of the experiment is a head mathematically, as a function of $p$.
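
The definition of sim1 is not reproduced in the post. The following is a minimal R sketch consistent with the description above; only the name sim1 comes from the solution below, and the body is an assumed reconstruction rather than the original code.

sim1 <- function(p) {
  # Repeat rounds of two independent flips of a coin with P(heads) = p
  # until the two flips in a round disagree; the result is the second
  # flip of that round (1 = heads, 0 = tails).
  repeat {
    flips <- rbinom(2, 1, p)
    if (flips[1] != flips[2]) return(flips[2])
  }
}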

(a) One possible approach:

> sapply(seq(0.2, 0.9, by = 0.2),
+        function(p) mean(replicate(10000, sim1(p))))
[1] 0.4913 0.4965 0.5034 0.4991

This suggests that the probability of heads may be 0.5 for any $p$.
(b) Let $A$ be the event that the process returns a head, and let $B$ be the event that the process ends after the first two flips. Then
$$
P(A)=P(A \cap B)+P\left(A \mid B^c\right) P\left(B^c\right) .
$$
Now $A \cap B$ is the event that the first toss is a tail and the second toss is a head, so $P(A \cap B)=(1-p) p$. $B$ is the event that either the first toss is a head and the second a tail, or the first is a tail and the second is a head, so $P(B)=2 p(1-p)$ and $P\left(B^c\right)=1-2 p(1-p)$. If the process does not end with the first two tosses, then it starts over again independently, so $P\left(A \mid B^c\right)=P(A)$. Therefore $P(A)$ satisfies
$$
P(A)=p(1-p)+P(A)\bigl(1-2 p(1-p)\bigr),
$$
and thus
$$
P(A)=\frac{p(1-p)}{2 p(1-p)}=\frac{1}{2},
$$
as the simulation in part (a) suggests. The requirement that $0<p<1$ ensures that the denominator is positive and that the process is guaranteed to end.
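
An equivalent derivation sums over the round in which the process stops instead of solving a fixed-point equation: each round of two flips ends the experiment with probability $2p(1-p)>0$ and yields a head (tail followed by head) with probability $p(1-p)$, so
$$
P(A)=\sum_{n=1}^{\infty}\bigl[1-2 p(1-p)\bigr]^{n-1} p(1-p)=\frac{p(1-p)}{2 p(1-p)}=\frac{1}{2},
$$
and the convergent geometric series also confirms that the process terminates with probability one.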

Textbooks


• An Introduction to Stochastic Modeling, Fourth Edition by Pinsky and Karlin (freely available through the university library)
• Essentials of Stochastic Processes, Third Edition by Durrett (freely available through the university library)

To reiterate, the textbooks are freely available through the university library. Note that you must be connected to the university Wi-Fi or VPN to access the ebooks from the library links. Furthermore, the library links take some time to populate, so do not be alarmed if the webpage looks bare for a few seconds.

