Tag: PHY525

Physics Homework Help | PHYS6562 Statistical Physics

Statistics-lab™ can provide you with coursework, exam, and tutoring support for cornell.edu PHYS6562 Statistical Physics!

PHYS6562 Statistical Physics Course Overview

PREREQUISITES 

The course presumes a high level of sophistication, equivalent to but not necessarily the same as that of a first-year physics graduate student (undergrad-level quantum, classical mechanics, and thermodynamics). Only a small portion of the course (roughly one and a half weeks) will demand a knowledge of quantum mechanics; students with no quantum background have found the rest of the course comprehensible and useful, if challenging. Primarily for graduate students.

Instructor: J. Sethna.

A broad, graduate-level view of statistical mechanics, with applications not only to physics and chemistry but also to computation, mathematics, dynamical and complex systems, and biology. Some traditional focus areas will not be covered in detail (thermodynamics, phase diagrams, perturbative methods, interacting gases and liquids).

PHYS6562 Statistical Physics HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.

A biopolymer undergoes transitions between two states, 1 and 2, with rates $R_{1 \rightarrow 2}=C e^{-\Delta f_{12} / k_B T}$ and $R_{2 \rightarrow 1}=C e^{-\Delta f_{21} / k_B T}$, where $\Delta f_{ij}$ is the free-energy barrier for the transition $i \rightarrow j$. Suppose that $\Delta f_{12}=\Delta e-T \Delta s$ and $\Delta f_{21}=\Delta e$, where $\Delta e>0$ and $\Delta s$ are the internal-energy and entropy changes, assumed independent of the temperature $T$. A weak oscillating force of frequency $\omega$ that couples to the reaction coordinate is applied. If the force induces a stochastic resonance, what is the optimal value of the noise strength $k_B T$? How does it depend on the entropy change $\Delta s$?
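
A minimal sketch of one standard route, assuming the usual two-state time-scale-matching picture of stochastic resonance (the total hopping rate matches the drive frequency up to an order-one factor, a condition not stated in the problem itself):
$$
R_{1 \rightarrow 2}+R_{2 \rightarrow 1}=C\left(1+e^{\Delta s / k_B}\right) e^{-\Delta e / k_B T} \sim \omega
\quad \Longrightarrow \quad
k_B T_{\mathrm{opt}} \approx \frac{\Delta e}{\ln \left[C\left(1+e^{\Delta s / k_B}\right) / \omega\right]} ,
$$
so increasing $\Delta s$ (which lowers the barrier $\Delta f_{12}$) decreases the optimal noise strength.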

Problem 2.

Show that the pressure at height $x$ of a stationary ideal gas obeying $p=\rho k_B T / m$, in a uniform gravitational field $g$ along the $x$-axis, is given by
$$
p(x)=p_0 \exp \left(-m g x /\left(k_B T\right)\right)
$$
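
A minimal sketch of the standard hydrostatic argument, assuming mechanical equilibrium of each gas slab together with the quoted ideal-gas relation (here $p_0 \equiv p(0)$):
$$
\frac{d p}{d x}=-\rho g=-\frac{m g}{k_B T}\, p
\quad \Longrightarrow \quad
p(x)=p_0\, e^{-m g x /\left(k_B T\right)} .
$$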

Problem 3.

A flexible polymer anchored to a planar surface at one end is subjected to a Couette flow with shear rate $\dot{\gamma}$, as shown in the figure below. Study the conformation of the chain in the steady state by (i) finding the mean square of the end-to-end distance (EED), $\left\langle\boldsymbol{R}^2\right\rangle$, where the average is taken over the steady state, and (ii) finding $\langle\boldsymbol{r}(s, t)\rangle$.
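
One possible starting point, assuming a Rouse (free-draining) description with the grafting wall at $z=0$ and the imposed Couette flow $\boldsymbol{v}=\dot{\gamma}\, z\, \hat{\boldsymbol{x}}$ (the choice of model and geometry are assumptions, since the figure is not reproduced here):
$$
\zeta\left[\frac{\partial \boldsymbol{r}(s, t)}{\partial t}-\dot{\gamma}\, z(s, t)\, \hat{\boldsymbol{x}}\right]=\kappa \frac{\partial^2 \boldsymbol{r}(s, t)}{\partial s^2}+\boldsymbol{\xi}(s, t),
\qquad \boldsymbol{r}(0, t)=\mathbf{0}, \quad\left.\frac{\partial \boldsymbol{r}}{\partial s}\right|_{s=N}=\mathbf{0},
$$
with $\zeta$ the monomer friction, $\kappa=3 k_B T / b^2$ the entropic spring constant, and $\boldsymbol{\xi}$ the thermal noise; averaging the steady-state solution over the noise then yields (i) and (ii).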

Problem 4.

Use the scaling argument above to show that the center-of-mass (CM) diffusion constant and the EED relaxation time for a chain in a good solvent, with $R_g \sim N^{\nu}$, are given by
$$
D_c \sim N^{-\nu} \quad \text { and } \quad \tau_Z \sim N^{3 \nu} .
$$
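
A minimal sketch, assuming the Zimm (non-draining) picture in which the coil of size $R_g$ moves as a single Stokes sphere through a solvent of viscosity $\eta_s$:
$$
D_c \sim \frac{k_B T}{\eta_s R_g} \sim N^{-\nu},
\qquad
\tau_Z \sim \frac{R_g^2}{D_c} \sim \frac{\eta_s R_g^3}{k_B T} \sim N^{3 \nu} .
$$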

Textbooks


• An Introduction to Stochastic Modeling, Fourth Edition, by Pinsky and Karlin (freely available through the university library here)
• Essentials of Stochastic Processes, Third Edition, by Durrett (freely available through the university library here)

To reiterate, the textbooks are freely available through the university library. Note that you must be connected to the university Wi-Fi or VPN to access the ebooks from the library links. Furthermore, the library links take some time to populate, so do not be alarmed if the webpage looks bare for a few seconds.


PHYS6562 Statistical Physics HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.

(1) The information entropy of a distribution $\left\{p_n\right\}$ is defined as $S=-\sum_n p_n \log _2 p_n$, where $n$ ranges over all possible configurations of a given physical system and $p_n$ is the probability of the state $|n\rangle$. If there are $\Omega$ possible states and each state is equally likely, then $S=\log _2 \Omega$, which is the usual dimensionless entropy in units of $\ln 2$.

Consider a normal deck of 52 distinct playing cards. A new deck is always prepared in the same order $(\mathrm{A}, 2, \cdots, \mathrm{K})$.
(a) What is the information entropy of the distribution of new decks?
(b) What is the information entropy of a distribution of completely randomized decks?
Now consider what it means to shuffle the cards. In an ideal riffle shuffle, the deck is split into two equal halves of 26 cards each. One then chooses at random from which half to drop the next card, until one runs through all the cards and a new order is established (see figure).

(c) What is the increase in information entropy for a distribution of new decks that each have been shuffled once?
(d) Assuming each subsequent shuffle results in the same entropy increase (i.e. neglecting redundancies), how many shuffles are necessary in order to completely randomize a deck?
(e) If in parts (b), (c), and (d), you were to use Stirling’s approximation,
$$
K ! \sim K^K e^{-K} \sqrt{2 \pi K}
$$
how would your answers have differed?

(a) Since each new deck arrives in the same order, we have $p_1=1$ while $p_{2, \ldots, 52 !}=0$. Therefore $S=0$.
(b) For completely randomized decks, $p_n=1 / \Omega$ with $n \in\{1, \ldots, \Omega\}$ and $\Omega=52!$, the total number of possible configurations. Thus, $S_{\text {random }}=\log _2 52 !=225.581$.

(c) After one riffle shuffle, there are $\Omega=\binom{52}{26}$ possible configurations. If all such configurations were equally likely, we would have $(\Delta S)_{\text {riffle }}=\log _2\binom{52}{26}=48.817$. However, they are not all equally likely. For example, the probability that we drop the entire left half-deck and then the entire right half-deck is $2^{-26}$: after the last card from the left half-deck is dropped, we have no more choices to make. On the other hand, the probability for the sequence LRLR $\cdots$ is $2^{-51}$, because it is only after the $51^{\text {st }}$ card is dropped that we have no more choices.

We can derive an exact expression for the entropy of the riffle shuffle in the following manner. Consider a deck of $N=2 K$ cards. The probability that we run out of choices after $K$ cards is the probability that the first $K$ cards dropped are all from one particular half-deck, which is $2 \cdot 2^{-K}$. Now let us ask for the probability that we run out of choices after $(K+1)$ cards are dropped. If all the remaining $(K-1)$ cards are from the right half-deck, we must have had one of the $\mathrm{R}$ cards among the first $K$ dropped. Note that this $\mathrm{R}$ card cannot be the $(K+1)^{\text {th }}$ card dropped, since then all of the first $K$ cards would be $\mathrm{L}$, which we have already considered. Thus, there are $\binom{K}{1}=K$ such configurations, each with probability $2^{-(K+1)}$. Next, suppose we run out of choices after $(K+2)$ cards are dropped. If the remaining $(K-2)$ cards are $\mathrm{R}$, we must have had 2 of the $\mathrm{R}$ cards among the first $(K+1)$ dropped, which gives $\binom{K+1}{2}$ possibilities. Note that the $(K+2)^{\text {th }}$ card must be $\mathrm{L}$, since if it were $\mathrm{R}$ the last $(K-1)$ cards would all be $\mathrm{R}$, which we have already considered. Continuing in this manner, we conclude
$$
\Omega_K=2 \sum_{n=0}^{K-1}\binom{K+n-1}{n}=\binom{2 K}{K}
$$
and
$$
S_K=-\sum_{a=1}^{\Omega_K} p_a \log _2 p_a=2 \sum_{n=0}^{K-1}\binom{K+n-1}{n} \cdot 2^{-(K+n)} \cdot(K+n) .
$$
For a deck of 52 cards, the actual entropy per riffle shuffle is $S_{26}=46.274$.
(d) Ignoring redundancies, we require $k=S_{\text {random }} /(\Delta S)_{\text {riffle }}=4.62$ shuffles if we assume all riffle outcomes are equally likely, and $k=4.88$ if we use the exact result for the riffle entropy. Since there are no fractional shuffles, we round up to $k=5$ in both cases. In fact, computer experiments show that the answer is $k=9$. The reason we are so far off is that we have ignored redundancies, i.e. we have assumed that all the states produced by two consecutive riffle shuffles are distinct. They are not! For decks with asymptotically large numbers of cards $N$, the number of riffle shuffles needed for complete randomization grows as $\frac{3}{2} \log _2 N$.
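
Part (e) is not worked out above. As a check (not part of the original solution), applying the Stirling form quoted in the problem to $\log _2 52 !$ reproduces the same numbers to the quoted precision:
$$
\log _2 52 ! \approx \frac{52 \ln 52-52+\frac{1}{2} \ln (2 \pi \cdot 52)}{\ln 2} \approx \frac{205.46-52+2.89}{0.693} \approx 225.58,
$$
so the entropies and shuffle counts in parts (b)-(d) are essentially unchanged.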

Problem 2.

(2) In problem #1, we ran across Stirling’s approximation,
$$
\ln K ! \sim K \ln K-K+\frac{1}{2} \ln (2 \pi K)+\mathcal{O}\left(K^{-1}\right),
$$
for large $K$. In this exercise, you will derive this expansion.
(a) Start by writing
$$
K !=\int_0^{\infty} d x x^K e^{-x},
$$
and define $x \equiv K(t+1)$ so that $K !=K^{K+1} e^{-K} F(K)$, where
$$
F(K)=\int_{-1}^{\infty} d t e^{K f(t)}
$$
Find the function $f(t)$.
(b) Expand $f(t)=\sum_{n=0}^{\infty} f_n t^n$ in a Taylor series and find a general formula for the expansion coefficients $f_n$. In particular, show that $f_0=f_1=0$ and that $f_2=-\frac{1}{2}$.
(c) If one ignores all the terms but the lowest order (quadratic) in the expansion of $f(t)$, show that
$$
\int_{-1}^{\infty} d t e^{-K t^2 / 2}=\sqrt{\frac{2 \pi}{K}}-R(K),
$$
and show that the remainder $R(K)>0$ is bounded from above by a function which decreases faster than any polynomial in $1 / K$.
(d) For the brave only! – Find the $\mathcal{O}\left(K^{-1}\right)$ term in the expansion for $\ln K$ !.

(a) Setting $x=K(t+1)$, we have
$$
K !=K^{K+1} e^{-K} \int_{-1}^{\infty} d t\,(t+1)^K e^{-K t}
$$
hence $f(t)=\ln (t+1)-t$
(b) The Taylor expansion of $f(t)$ is
$$
f(t)=-\frac{1}{2} t^2+\frac{1}{3} t^3-\frac{1}{4} t^4+\ldots
$$
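The general coefficient requested in part (b), which the expansion above does not state explicitly, follows from the logarithm series: since $f(t)=\ln (1+t)-t$ and $\ln (1+t)=\sum_{n=1}^{\infty}(-1)^{n+1} t^n / n$,
$$
f_0=f_1=0, \qquad f_n=\frac{(-1)^{n+1}}{n} \quad(n \geq 2),
$$
consistent with $f_2=-\frac{1}{2}$, $f_3=+\frac{1}{3}$, $f_4=-\frac{1}{4}$ above.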
(c) Retaining only the leading term in the Taylor expansion of $f(t)$, we have
$$
\begin{aligned}
F(K) & \simeq \int_{-1}^{\infty} d t\, e^{-K t^2 / 2} \\
& =\sqrt{\frac{2 \pi}{K}}-\int_1^{\infty} d t\, e^{-K t^2 / 2} .
\end{aligned}
$$
Writing $t \equiv s+1$, the remainder is found to be
$$
R(K)=e^{-K / 2} \int_0^{\infty} d s e^{-K s^2 / 2} e^{-K s}<\sqrt{\frac{\pi}{2 K}} e^{-K / 2},
$$
which decreases exponentially with $K$, faster than any power.
(d) We have
$$
\begin{aligned}
F(K) & =\int_{-1}^{\infty} d t\, e^{-\frac{1}{2} K t^2} e^{\frac{1}{3} K t^3-\frac{1}{4} K t^4+\ldots} \\
& =\int_{-1}^{\infty} d t\, e^{-\frac{1}{2} K t^2}\left\{1+\frac{1}{3} K t^3-\frac{1}{4} K t^4+\frac{1}{18} K^2 t^6+\ldots\right\} \\
& =\sqrt{\frac{2 \pi}{K}} \cdot\left\{1-\frac{3}{4} K^{-1}+\frac{5}{6} K^{-1}+\mathcal{O}\left(K^{-2}\right)\right\} \\
& =\sqrt{\frac{2 \pi}{K}} \cdot\left\{1+\frac{1}{12} K^{-1}+\mathcal{O}\left(K^{-2}\right)\right\}
\end{aligned}
$$
Thus,
$$
\ln K !=K \ln K-K+\frac{1}{2} \ln K+\frac{1}{2} \ln (2 \pi)+\frac{1}{12} K^{-1}+\mathcal{O}\left(K^{-2}\right)
$$
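
As a quick numerical check (not part of the original problem set), the expansion is already very accurate at modest $K$; for $K=10$,
$$
\ln 10 ! = \ln 3628800 \approx 15.10441, \qquad 10 \ln 10-10+\tfrac{1}{2} \ln 10+\tfrac{1}{2} \ln (2 \pi)+\tfrac{1}{120} \approx 15.10442 .
$$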

Physics Homework Help | PHYS6562 Statistical Physics

Statistics-lab™ can provide you with coursework, exam, and tutoring support for cornell.edu PHYS6562 Statistical Physics! Please look for Statistics-lab™. Statistics-lab™ will safeguard your studies abroad.