## EE261 Fourier Analysis Course Introduction

The goals of the course are to gain facility with the Fourier transform — both specific techniques and general principles — and to learn to recognize when, why, and how it is used. The subject has great variety but also great coherence, and the hope is that students come to appreciate both.

## TOPICS

Topics include: The Fourier transform as a tool for solving physical problems. Fourier series, the Fourier transform of continuous and discrete signals and its properties. The Dirac delta, distributions, and generalized transforms. Convolutions and correlations and applications; probability distributions, sampling theory, filters, and analysis of linear systems. The discrete Fourier transform and the FFT algorithm. Multidimensional Fourier transform and use in imaging. Further applications to optics, crystallography. Emphasis is on relating the theoretical principles to solving practical engineering and science problems.

## EE261 Fourier Analysis HELP (EXAM HELP, ONLINE TUTOR)

1. Let us define
$$J_0(x)=\frac{1}{2 \pi} \int_0^{2 \pi} \cos (x \sin (\theta)) d \theta$$
Show that
$$f(x) \mapsto F(\xi)=2 \pi \int_0^{\infty} J_0(2 \pi \xi x) f(x)\, x \, d x$$
defines a unitary map from $L^2([0, \infty), x \, dx)$ to itself. Describe the relation to the Fourier transform of radial functions in two dimensions.
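Before tackling the unitarity proof, it can help to check the integral definition of $J_0$ numerically. The sketch below (standard library only; the function names are illustrative, not from the course) compares the averaged-cosine integral against the classical power series $J_0(x)=\sum_{k\ge 0}(-1)^k (x/2)^{2k}/(k!)^2$, which the integral should reproduce:

```python
import math

def j0_integral(x, n=2000):
    """J0(x) = (1/2π) ∫_0^{2π} cos(x sin θ) dθ, via the midpoint rule.

    The integrand is smooth and 2π-periodic, so the midpoint rule
    converges extremely fast here.
    """
    h = 2 * math.pi / n
    total = sum(math.cos(x * math.sin((k + 0.5) * h)) for k in range(n))
    return total * h / (2 * math.pi)

def j0_series(x, terms=30):
    """Power series J0(x) = Σ_{k≥0} (-1)^k (x/2)^{2k} / (k!)^2."""
    s, term = 0.0, 1.0
    for k in range(terms):
        s += term
        term *= -(x / 2) ** 2 / ((k + 1) ** 2)  # ratio of consecutive terms
    return s

# The two definitions agree to high precision on a few sample points.
for x in (0.0, 1.0, 2.5, 5.0):
    assert abs(j0_integral(x) - j0_series(x)) < 1e-10
```

A quick consequence worth noting: $J_0(0)=1$, since the integrand is identically $1$ when $x=0$.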

1. A fair die is labeled with the numbers 1 through 6, and then relabeled with the even numbers 2 through 12. Compute the entropy (in bits) of a single roll in each case.

The entropy of a fair die labeled from 1 to 6 is:

$$H = -\sum_{i=1}^6 p_i \log_2 p_i = -\sum_{i=1}^6 \frac{1}{6} \log_2 \frac{1}{6} = \log_2 6 \approx 2.585$$

where $p_i = \frac{1}{6}$ is the probability of rolling each number on the die.

When the die is relabeled with the even numbers from 2 to 12, each outcome has probability $\frac{1}{6}$ since the die is still fair. There are six possible outcomes, corresponding to rolling 2, 4, 6, 8, 10, or 12. Thus, the entropy of the roll is:

$$H = -\sum_{i=1}^6 p_i \log_2 p_i = -\sum_{i=1}^6 \frac{1}{6} \log_2 \frac{1}{6} = \log_2 6 \approx 2.585$$

The entropy is the same as for the original die because the two distributions have the same number of possible outcomes, and the probabilities of those outcomes are the same. In general, the entropy of a distribution depends only on the probabilities of the outcomes, not on their labels or any other properties.

Therefore, the entropy of the roll of a fair die labeled from 1 to 6, or with the even numbers from 2 to 12, is $\log_2 6 \approx 2.585$ bits.
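The computation above is easy to confirm mechanically. A minimal sketch (standard library only; the function name is illustrative):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -Σ p_i log2 p_i, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two labelings of the same fair die: the probability vectors are identical.
die_1_to_6 = {face: 1 / 6 for face in range(1, 7)}
die_evens  = {face: 1 / 6 for face in (2, 4, 6, 8, 10, 12)}

h1 = entropy_bits(die_1_to_6.values())
h2 = entropy_bits(die_evens.values())

# Relabeling the outcomes leaves the probabilities, hence the entropy, unchanged.
assert abs(h1 - h2) < 1e-12
assert abs(h1 - math.log2(6)) < 1e-12
print(h1)  # ≈ 2.585
```

The `assert` lines encode exactly the point made in the text: entropy depends only on the probability values, not on the labels attached to the outcomes.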

1. Let $d \rho$ be a probability measure on $\mathbb{R}$ with $\int x \, d \rho(x)=0$ and $\int x^4 \, d \rho(x)<\infty$. Prove the central limit theorem for the sum of independent random variables with this distribution.

Specifically, if $X_1, X_2, \ldots$ are $d \rho$-distributed, show that for any Schwartz function $f$
$$\mathbb{E}\left\{f\left(\frac{X_1+\cdots+X_n}{n^{1 / 2}}\right)\right\} \longrightarrow \frac{1}{\sqrt{2 \pi \sigma^2}} \int \exp \left\{-\frac{x^2}{2 \sigma^2}\right\} f(x) \, d x$$
as $n \rightarrow \infty$, where $\sigma^2 = \int x^2 \, d\rho(x)$. Hint: first show convergence for $f(x)=e^{-2 \pi i x \xi}$, uniformly for $\xi$ in a compact set.

To prove the central limit theorem for the sum of independent random variables with probability measure $d\rho$, we will use characteristic functions.

Let $X_1, X_2, \ldots$ be independent random variables with probability measure $d\rho$. Then the characteristic function of $X_i$ is given by $\phi_i(t) = \mathbb{E}[e^{itX_i}]$.

By independence, the characteristic function of $S_n = \frac{1}{\sqrt{n}}(X_1 + \cdots + X_n)$ is the product of the characteristic functions of the individual random variables:

$$\phi_{S_n}(t) = \prod_{i=1}^n \phi_i\left(\frac{t}{\sqrt{n}}\right)$$

Since the $X_i$ are identically distributed, write $\phi = \phi_i$, so that $\phi_{S_n}(t) = \phi(t/\sqrt{n})^n$. Because $\int x \, d\rho(x) = 0$ and $\int x^4 \, d\rho(x) < \infty$, a Taylor expansion of $\phi$ at the origin gives
$$\phi(t) = 1 - \frac{\sigma^2 t^2}{2} + O(t^4), \qquad \sigma^2 = \int x^2 \, d\rho(x),$$
and hence, uniformly for $t$ in a compact set,
$$\phi_{S_n}(t) = \left(1 - \frac{\sigma^2 t^2}{2n} + O(n^{-2})\right)^n \longrightarrow e^{-\sigma^2 t^2 / 2} \quad \text{as } n \to \infty.$$
This is exactly the convergence in the hint, since $f(x) = e^{-2\pi i x \xi}$ gives $\mathbb{E}[f(S_n)] = \phi_{S_n}(-2\pi\xi)$.

Now let $\hat{f}$ denote the Fourier transform of the Schwartz function $f$. By Fourier inversion,
$$\mathbb{E}[f(S_n)] = \int_{\mathbb{R}} \hat{f}(\xi)\, \phi_{S_n}(2\pi\xi)\, d\xi \longrightarrow \int_{\mathbb{R}} \hat{f}(\xi)\, e^{-2\pi^2 \sigma^2 \xi^2}\, d\xi = \frac{1}{\sqrt{2\pi\sigma^2}} \int \exp\left\{-\frac{x^2}{2\sigma^2}\right\} f(x)\, dx,$$
where the limit is justified by dominated convergence ($|\hat{f}|$ is integrable and $|\phi_{S_n}| \leq 1$), and the last equality holds because $e^{-2\pi^2\sigma^2\xi^2}$ is the Fourier transform of the Gaussian density $\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/(2\sigma^2)}$.
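The key convergence $\phi(t/\sqrt{n})^n \to e^{-\sigma^2 t^2/2}$ can also be observed numerically. The sketch below is an illustration, not part of the proof: it takes a centered fair die as the distribution $d\rho$ (an assumed example, chosen so the characteristic function can be evaluated exactly) and compares $\phi_{S_n}$ against the Gaussian characteristic function:

```python
import cmath
import math

# Centered fair-die distribution: mean 0, variance sigma2 (illustrative choice).
values = [-2.5, -1.5, -0.5, 0.5, 1.5, 2.5]
sigma2 = sum(x * x for x in values) / len(values)

def phi(t):
    """Characteristic function φ(t) = E[e^{itX}] of the centered die."""
    return sum(cmath.exp(1j * t * x) for x in values) / len(values)

def phi_Sn(t, n):
    """φ_{S_n}(t) = φ(t/√n)^n for S_n = (X_1 + ... + X_n)/√n, X_i i.i.d."""
    return phi(t / math.sqrt(n)) ** n

# As n grows, φ_{S_n}(t) approaches the Gaussian characteristic function
# e^{-σ² t² / 2}, as claimed in the derivation above.
for t in (0.5, 1.0, 2.0):
    gauss = math.exp(-sigma2 * t * t / 2)
    err_small_n = abs(phi_Sn(t, 10) - gauss)
    err_large_n = abs(phi_Sn(t, 10000) - gauss)
    assert err_large_n < err_small_n  # the error shrinks as n grows
    assert err_large_n < 1e-3
```

The finite fourth moment is what controls the $O(t^4)$ remainder in the Taylor expansion, which is why the error here decays like $1/n$.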

## Textbooks

• An Introduction to Stochastic Modeling, Fourth Edition, by Pinsky and Karlin (freely available through the university library here)
• Essentials of Stochastic Processes, Third Edition, by Durrett (freely available through the university library here)

To reiterate, the textbooks are freely available through the university library. Note that you must be connected to the university Wi-Fi or VPN to access the ebooks from the library links. Furthermore, the library links take some time to populate, so do not be alarmed if the webpage looks bare for a few seconds.

Statistics-lab™ can provide tutoring, exam-help, and assignment services for Stanford's EE261 Fourier analysis course! Please look for Statistics-lab™. Statistics-lab™ supports you throughout your studies abroad.