Math Assignment Help | EE376A Information Theory

Statistics-lab™ can provide assignment writing, exam help, and tutoring services for stanford.edu EE376A Information Theory!

EE376A Information Theory Course Overview

Information theory is the branch of mathematics that deals with the quantification, storage, and communication of information. It was developed by Claude Shannon in the 1940s, and has applications in a wide range of fields, including communications, computer science, physics, and biology.

The basic idea in information theory is that information can be thought of as a reduction in uncertainty. The more uncertain we are about something, the more information we gain when we learn about it. For example, if I tell you that the weather tomorrow will be either sunny or rainy, and you don’t know which one, then your uncertainty about the weather is high. If I then tell you that the weather will be sunny, then your uncertainty is reduced, and you gain some information.
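To make this quantitative (a worked illustration added here, not part of the official course description): if sunny and rainy are equally likely, learning the actual weather resolves exactly one bit of uncertainty, as measured by Shannon's entropy:
$$
H(\text{Weather}) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit}.
$$
If one outcome were much more likely than the other, the entropy, and hence the information gained from the forecast, would be smaller.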


EE376A Information Theory HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.

Zero conditional entropy. Show that if $H(Y \mid X)=0$, then $Y$ is a function of $X$, i.e., for all $x$ with $p(x)>0$, there is only one possible value of $y$ with $p(x, y)>0$.
Solution: Zero Conditional Entropy. Assume that there exists an $x$, say $x_0$, and two different values of $y$, say $y_1$ and $y_2$, such that $p\left(x_0, y_1\right)>0$ and $p\left(x_0, y_2\right)>0$. Then $p\left(x_0\right) \geq p\left(x_0, y_1\right)+p\left(x_0, y_2\right)>0$, so $p\left(y_1 \mid x_0\right)$ and $p\left(y_2 \mid x_0\right)$ are both strictly between 0 and 1. Thus
$$
\begin{aligned}
H(Y \mid X) &= -\sum_x p(x) \sum_y p(y \mid x) \log p(y \mid x) \\
&\geq p\left(x_0\right)\left(-p\left(y_1 \mid x_0\right) \log p\left(y_1 \mid x_0\right)-p\left(y_2 \mid x_0\right) \log p\left(y_2 \mid x_0\right)\right) \\
&> 0,
\end{aligned}
$$
since $-t \log t \geq 0$ for $0 \leq t \leq 1$, and is strictly positive for $t$ not equal to 0 or 1. This contradicts the assumption $H(Y \mid X)=0$, so $Y$ must be a function of $X$. Conversely, if $Y$ is a function of $X$, then every $p(y \mid x)$ with $p(x)>0$ equals 0 or 1, so $H(Y \mid X)=0$. Therefore the conditional entropy $H(Y \mid X)$ is 0 if and only if $Y$ is a function of $X$.
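As a numerical sanity check of this argument (my own sketch, not part of the official solution; the joint distributions below are made up for illustration), the following Python snippet computes $H(Y \mid X)$ directly and confirms it is positive exactly when $Y$ is not a function of $X$:

```python
import math

# Hypothetical joint distribution p(x, y), chosen for illustration only.
# Y is NOT a function of X here: x = 0 can produce either y = 0 or y = 1.
p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5}

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), in bits."""
    px = {}  # marginal p(x)
    for (x, _y), prob in joint.items():
        px[x] = px.get(x, 0.0) + prob
    return -sum(prob * math.log2(prob / px[x])
                for (x, _y), prob in joint.items() if prob > 0)

print(conditional_entropy(p))  # 0.5 > 0: Y is not a function of X

# When Y is a function of X, the conditional entropy vanishes.
q = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(q))  # 0 (may print as -0.0)
```

The first distribution gives $H(Y \mid X) = 0.5$ bits, strictly positive, while the second, in which $Y$ is determined by $X$, gives zero, matching the claim.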

Problem 2.

Data processing. Let $X_1 \rightarrow X_2 \rightarrow X_3 \rightarrow \cdots \rightarrow X_n$ form a Markov chain in this order; i.e., let
$$
p\left(x_1, x_2, \ldots, x_n\right)=p\left(x_1\right) p\left(x_2 \mid x_1\right) \cdots p\left(x_n \mid x_{n-1}\right).
$$
Reduce $I\left(X_1 ; X_2, \ldots, X_n\right)$ to its simplest form.
Solution: Data Processing. By the chain rule for mutual information,
$$
I\left(X_1 ; X_2, \ldots, X_n\right)=I\left(X_1 ; X_2\right)+I\left(X_1 ; X_3 \mid X_2\right)+\cdots+I\left(X_1 ; X_n \mid X_2, \ldots, X_{n-1}\right).
$$
By the Markov property, the past and the future are conditionally independent given the present, so $X_1$ and $X_k$ are independent given $X_2, \ldots, X_{k-1}$ and hence $I\left(X_1 ; X_k \mid X_2, \ldots, X_{k-1}\right)=0$ for every $k \geq 3$. All terms except the first are therefore zero, and
$$
I\left(X_1 ; X_2, \ldots, X_n\right)=I\left(X_1 ; X_2\right).
$$
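To see this concretely (a numerical sketch of my own, not from the course materials; the chain parameters are arbitrary), the following Python snippet builds a binary Markov chain $X_1 \rightarrow X_2 \rightarrow X_3$ and checks that $I\left(X_1 ; X_2, X_3\right)=I\left(X_1 ; X_2\right)$:

```python
import math
from itertools import product

# Hypothetical binary Markov chain X1 -> X2 -> X3; the numbers are
# arbitrary choices for illustration.
p1 = [0.3, 0.7]                    # p(x1)
t12 = [[0.9, 0.1], [0.2, 0.8]]     # p(x2 | x1)
t23 = [[0.6, 0.4], [0.5, 0.5]]     # p(x3 | x2)

# Joint distribution p(x1, x2, x3) = p(x1) p(x2|x1) p(x3|x2).
joint = {(a, b, c): p1[a] * t12[a][b] * t23[b][c]
         for a, b, c in product(range(2), repeat=3)}

def mutual_information(pxy):
    """I(X;Y) in bits, from a joint distribution over (x, y) pairs."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# I(X1; (X2, X3)): treat the pair (x2, x3) as one variable.
p_x1_x23 = {(a, (b, c)): p for (a, b, c), p in joint.items()}

# I(X1; X2): marginalize out X3.
p_x1_x2 = {}
for (a, b, c), p in joint.items():
    p_x1_x2[(a, b)] = p_x1_x2.get((a, b), 0.0) + p

print(mutual_information(p_x1_x23))  # equals the value below...
print(mutual_information(p_x1_x2))   # ...by the Markov property
```

Both printed values agree to floating-point precision, confirming that $X_3$ adds no information about $X_1$ beyond what $X_2$ already carries.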

Textbooks


• An Introduction to Stochastic Modeling, Fourth Edition by Pinsky and Karlin (freely available through the university library)
• Essentials of Stochastic Processes, Third Edition by Durrett (freely available through the university library)
To reiterate, the textbooks are freely available through the university library. Note that
you must be connected to the university Wi-Fi or VPN to access the ebooks from the library
links. Furthermore, the library links take some time to populate, so do not be alarmed if
the webpage looks bare for a few seconds.
