Math Assignment Help | EE376A Information Theory

Statistics-lab™ can provide you with assignment, exam, and tutoring support for the stanford.edu EE376A Information Theory course!


EE376A Information Theory Course Overview

This course is about how to measure, represent, and communicate information effectively. Why bits have become the universal currency for information exchange. How information theory bears on the design and operation of modern-day systems such as smartphones and the Internet. What are entropy and mutual information, and why are they so fundamental to data representation, communication, and inference. Practical compression and error correction. Relations and applications to probability, statistics, machine learning, biological and artificial neural networks, genomics, quantum information, and blockchains.

PREREQUISITES 

Lectures will focus on intuition, applications and ways in which communication and representation of information manifest in various areas. The material will be explored in more depth and rigor via videos of additional lectures (by the course instructors) made available to those interested. Homework and projects will be tailored to students’ backgrounds, interests, and goals. There will also be a fun outreach component.

We encourage everyone – from the techies to the literature majors – to enroll. Guaranteed learning, fun, contribution to social good, and new friendships with people from departments and schools other than your own. We’ll assume you’ve been exposed to basic probability at the level encountered in a first undergraduate course, or have the motivation to dedicate the first few weeks of the quarter to acquainting yourself (under our guidance) with this material.

EE376A Information Theory HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.
  1. Computing entropy. (2 points each) In this problem, you are to compute the entropy of various probability distributions, represented by sets of numbers in brackets. If there are $K$ numbers in brackets, then there are $K$ possible values of the corresponding random variable, whose probabilities are given by the numbers. In other words, a fair die would be written $\left[\frac{1}{6}, \frac{1}{6}, \frac{1}{6}, \frac{1}{6}, \frac{1}{6}, \frac{1}{6}\right]$. Compute the entropy of the following distributions (don’t use a calculator unless the problem says you can):
    a $\left[\frac{1}{4}, \frac{1}{4}, \frac{1}{4}, \frac{1}{4}\right]$
    b $\left[\frac{1}{4}, \frac{1}{4}, \frac{1}{4}, \frac{1}{4}, 0,0,0\right]$
    c Compare answers a and b. Make a general statement about entropy calculations.
    d $\left[\frac{1}{4}, \frac{1}{4}, \frac{1.1}{4}, \frac{0.9}{4}\right]$ (you can use a calculator for this one)
    e Compare answers a and d. Can you offer a conjecture based upon these? (No need to prove it right now.)
    f $\left[\frac{1}{8}, \frac{1}{8}, \frac{1}{8}, \frac{1}{8}, \frac{1}{8}, \frac{1}{8}, \frac{1}{8}, \frac{1}{8}\right]$
    g $\left[\frac{1}{n}, \frac{1}{n}, \frac{1}{n}, \ldots, \frac{1}{n}\right]$

a) The entropy of distribution a is: \begin{align*} H &= -\sum_{i=1}^4 p_i\log_2(p_i) \\ &= -\left(\frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4}\right) \\ &= -\left(-2\right) \\ &= 2 \end{align*}

b) The entropy of distribution b is: \begin{align*} H &= -\sum_{i=1}^7 p_i\log_2(p_i) \\ &= -\left(\frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4} + 0\cdot\log_2 0 + 0\cdot\log_2 0 + 0\cdot\log_2 0\right) \\ &= -\left(-2\right) \\ &= 2 \end{align*} where we use the standard convention $0\log_2 0 = 0$, justified by the limit $p\log_2 p \to 0$ as $p \to 0$.

c) The entropy of distribution b equals the entropy of distribution a. In general, outcomes with probability zero contribute nothing to the entropy, so padding a distribution with zero-probability outcomes leaves its entropy unchanged.

d) The entropy of distribution d is: \begin{align*} H &= -\sum_{i=1}^4 p_i\log_2(p_i) \\ &= -\left(\frac{1}{4}\log_2\frac{1}{4} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1.1}{4}\log_2\frac{1.1}{4} + \frac{0.9}{4}\log_2\frac{0.9}{4}\right) \\ &\approx -\left(-1.996\right) \\ &\approx 1.996 \end{align*}

e) The entropy of distribution d is slightly lower than the entropy of distribution a. Based on these examples, we can conjecture that the uniform distribution maximizes entropy: any deviation from equal probabilities decreases it. (No need to prove it right now.)

f) The entropy of distribution f is: \begin{align*} H &= -\sum_{i=1}^8 p_i\log_2(p_i) \\ &= -\left(\frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8}\right) \\ &= -\left(-3\right) \\ &= 3 \end{align*}

g) The entropy of distribution g is: \begin{align*} H &= -\sum_{i=1}^n \frac{1}{n}\log_2\frac{1}{n} \\ &= -n\cdot\frac{1}{n}\log_2\frac{1}{n} \\ &= \log_2 n \end{align*} so the uniform distribution on $n$ outcomes has entropy $\log_2 n$; parts a and f are the special cases $n = 4$ and $n = 8$.
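
The calculations above are easy to verify numerically. Below is a minimal Python sketch (not part of the course materials; the helper name `entropy` is our own) that evaluates the entropy formula for parts (a), (b), (d), (f), and (g):

```python
import math

def entropy(p):
    """Shannon entropy in bits, using the convention 0 * log2(0) = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([1/4, 1/4, 1/4, 1/4]))           # (a) -> 2.0
print(entropy([1/4, 1/4, 1/4, 1/4, 0, 0, 0]))  # (b) -> 2.0, the zeros add nothing
print(entropy([1/4, 1/4, 1.1/4, 0.9/4]))       # (d) -> ~1.996, just below 2
print(entropy([1/8] * 8))                      # (f) -> 3.0
n = 5                                          # (g) for any n, the entropy is log2(n)
print(entropy([1/n] * n), math.log2(n))        # both ~2.322 for n = 5
```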

Problem 2.

A die is labeled from 1 to 6. Assuming it’s fair, what is the entropy of the roll (you can leave a “log” in your answer)? The die is relabeled with the even numbers from 2 to 12. What is its entropy now? (2 points)

The entropy of a fair die labeled from 1 to 6 is:

$$H = -\sum_{i=1}^6 p_i \log_2 p_i = -\sum_{i=1}^6 \frac{1}{6} \log_2 \frac{1}{6} = \log_2 6 \approx 2.585$$

where $p_i = \frac{1}{6}$ is the probability of rolling each number on the die.

When the die is relabeled with the even numbers from 2 to 12, each outcome has probability $\frac{1}{6}$ since the die is still fair. There are six possible outcomes, corresponding to rolling 2, 4, 6, 8, 10, or 12. Thus, the entropy of the roll is:

$$H = -\sum_{i=1}^6 p_i \log_2 p_i = -\sum_{i=1}^6 \frac{1}{6} \log_2 \frac{1}{6} = \log_2 6 \approx 2.585$$

The entropy is the same as for the original die because the two distributions have the same number of possible outcomes, and the probabilities of those outcomes are the same. In general, the entropy of a distribution depends only on the probabilities of the outcomes, not on their labels or any other properties.

Therefore, the entropy of the roll of a fair die labeled from 1 to 6 or even numbers from 2 to 12 is $\log_2 6 \approx 2.585$.
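
As a quick numerical illustration of the point about labels (again an illustrative sketch, not course code), the entropy formula only ever sees the probability vector, so computing it for either die gives the same number:

```python
import math

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair die has six equally likely outcomes; whether the faces read
# 1..6 or 2, 4, ..., 12 never enters the formula, only the probabilities do.
print(entropy([1/6] * 6))   # ~2.585 bits
print(math.log2(6))         # the same value, log2(6)
```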

Textbooks


• An Introduction to Stochastic Modeling, Fourth Edition by Pinsky and Karlin (freely available through the university library here)
• Essentials of Stochastic Processes, Third Edition by Durrett (freely available through the university library here)
To reiterate, the textbooks are freely available through the university library. Note that you must be connected to the university Wi-Fi or VPN to access the ebooks from the library links. Furthermore, the library links take some time to populate, so do not be alarmed if the webpage looks bare for a few seconds.

EE376A Information Theory

Statistics-lab™ can provide you with assignment, exam, and tutoring support for the stanford.edu EE376A Information Theory course! Look for Statistics-lab™. Statistics-lab™ will support you throughout your studies abroad.
