Math Assignment Help | AM221 Convex optimization

Statistics-lab™ can provide you with assignment, exam, and tutoring services for the harvard.edu AM221 Convex optimization course!

AM221 Convex optimization Course Overview

This is a graduate-level course on optimization. The course covers mathematical programming and combinatorial optimization from the perspective of convex optimization, which is a central tool for solving large-scale problems. In recent years convex optimization has had a profound impact on statistical machine learning, data analysis, mathematical finance, signal processing, control, theoretical computer science, and many other areas. The first part will be dedicated to the theory of convex optimization and its direct applications. The second part will focus on advanced techniques in combinatorial optimization using machinery developed in the first part.

PREREQUISITES 

Instructor: Yaron Singer (OH: Wednesdays 4-5pm, MD239)
Teaching fellows:

  • Thibaut Horel (OH: Tuesdays 4:30-5:30pm, MD’s 2nd floor lounge)
  • Rajko Radovanovic (OH: Mondays 5:30-6:30pm, MD’s 2nd floor lounge)

Time: Monday & Wednesday, 2:30pm-4:00pm
Room: MD119
Sections: Wednesdays 5:30-7:00pm in MD221

AM221 Convex optimization HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.

Bi-criterion optimization. Figure 4.11 shows the optimal trade-off curve and the set of achievable values for the bi-criterion optimization problem
$$
\text{minimize (w.r.t. } \mathbf{R}_{+}^2) \quad \left(\|Ax-b\|_2^2,\ \|x\|_2^2\right)
$$
for some $A \in \mathbf{R}^{100 \times 10}$, $b \in \mathbf{R}^{100}$. Answer the following questions using information from the plot. We denote by $x_{\mathrm{ls}}$ the solution of the least-squares problem
$$
\text{minimize } \|Ax-b\|_2^2 .
$$
(a) What is $\|x_{\mathrm{ls}}\|_2$?
(b) What is $\|Ax_{\mathrm{ls}}-b\|_2$?
(c) What is $\|b\|_2$?

(a) From the plot, the least-squares solution corresponds to a value of $\|x\|_2^2$ of approximately 1.0. Therefore $\|x_{\mathrm{ls}}\|_2 \approx \sqrt{1.0} = 1.0$.

(b) From the plot, the least-squares solution corresponds to a value of $\|Ax-b\|_2^2$ of approximately 5.0. Therefore $\|Ax_{\mathrm{ls}}-b\|_2 \approx \sqrt{5.0} \approx 2.236$.

(c) $\|b\|_2$ can also be read from the plot: at $x=0$ the residual is $b$ itself, so $\|b\|_2^2$ equals the value of $\|Ax-b\|_2^2$ at the endpoint of the trade-off curve where $\|x\|_2^2 = 0$, and $\|b\|_2$ is its square root.
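
The same quantities can be reproduced numerically. The sketch below uses randomly generated stand-ins for $A$ and $b$ (the actual data behind Figure 4.11 is not available), solves the least-squares problem, and traces the trade-off curve by scalarization, i.e. by minimizing $\|Ax-b\|_2^2 + \lambda\|x\|_2^2$ for a range of $\lambda > 0$.

import numpy as np

# Minimal numerical sketch of the bi-criterion problem above.
# The actual A and b behind Figure 4.11 are not available, so we use
# random stand-ins of the same shape purely for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
b = rng.standard_normal(100)

# Least-squares solution x_ls and the quantities from parts (a) and (b).
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print("||x_ls||_2       =", np.linalg.norm(x_ls))
print("||A x_ls - b||_2 =", np.linalg.norm(A @ x_ls - b))

# Part (c): at x = 0 the residual is b itself, which corresponds to the
# endpoint of the trade-off curve where ||x||_2^2 = 0.
print("||b||_2          =", np.linalg.norm(b))

# Scalarization: minimizing ||Ax-b||_2^2 + lam*||x||_2^2 for lam > 0
# traces out the optimal trade-off curve (ridge regression).
for lam in [0.01, 0.1, 1.0, 10.0, 100.0]:
    x = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
    print(f"lam={lam:7.2f}  ||Ax-b||^2={np.linalg.norm(A @ x - b)**2:8.3f}  "
          f"||x||^2={np.linalg.norm(x)**2:7.3f}")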

Problem 2.

Self-concordance and negative entropy.
(a) Show that the negative entropy function $x \log x$ (on $\mathbf{R}_{++}$) is not self-concordant.
(b) Show that for any $t>0$, $t x \log x - \log x$ is self-concordant (on $\mathbf{R}_{++}$).

(a) To show that the negative entropy function $f(x) = x \log x$ is not self-concordant, we check the self-concordance inequality
$$
\left|f'''(x)\right| \leq 2\left(f''(x)\right)^{3/2} \quad \text{for all } x > 0 .
$$
The derivatives are
$$
f'(x) = \log x + 1, \qquad f''(x) = \frac{1}{x}, \qquad f'''(x) = -\frac{1}{x^2} .
$$
Substituting into the inequality gives
$$
\frac{1}{x^2} \leq 2\left(\frac{1}{x}\right)^{3/2} = \frac{2}{x^{3/2}} .
$$
Multiplying both sides by $x^2 > 0$ and simplifying, this is equivalent to
$$
1 \leq 2\sqrt{x},
$$
i.e. $x \geq 1/4$. The inequality therefore fails for every $x \in (0, 1/4)$, so the negative entropy function $x \log x$ is not self-concordant on $\mathbf{R}_{++}$.
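
As a sanity check, the short script below (a numerical sketch, not part of the proof) evaluates the self-concordance ratio $|f'''(x)| / \bigl(2 f''(x)^{3/2}\bigr)$ for $f(x) = x \log x$ and confirms that it exceeds 1 exactly when $x < 1/4$.

# Numerical check of part (a): the self-concordance ratio
# |f'''(x)| / (2 f''(x)^{3/2}) for f(x) = x log x is 1/(2*sqrt(x)),
# which exceeds 1 precisely for x < 1/4.
def ratio_neg_entropy(x):
    f2 = 1.0 / x          # f''(x)  = 1/x
    f3 = -1.0 / x**2      # f'''(x) = -1/x^2
    return abs(f3) / (2.0 * f2**1.5)

for x in [0.01, 0.1, 0.25, 1.0, 10.0]:
    print(f"x={x:6.2f}  ratio={ratio_neg_entropy(x):8.3f}")
# ratio > 1 for x < 0.25, ratio = 1 at x = 0.25, ratio < 1 for x > 0.25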

(b) Let $f(x) = t x \log x - \log x$ with $t > 0$. Its derivatives are
$$
f'(x) = t(\log x + 1) - \frac{1}{x}, \qquad
f''(x) = \frac{t}{x} + \frac{1}{x^2} = \frac{tx+1}{x^2}, \qquad
f'''(x) = -\frac{t}{x^2} - \frac{2}{x^3} = -\frac{tx+2}{x^3} .
$$
Note that $f''(x) > 0$ on $\mathbf{R}_{++}$, so $f$ is convex. The self-concordance inequality $|f'''(x)| \leq 2\left(f''(x)\right)^{3/2}$ becomes
$$
\frac{tx+2}{x^3} \leq 2\,\frac{(tx+1)^{3/2}}{x^3} .
$$
Multiplying both sides by $x^3 > 0$ and setting $u = tx > 0$, this is equivalent to
$$
u + 2 \leq 2(u+1)^{3/2} \quad \text{for all } u > 0 .
$$
At $u = 0$ both sides equal $2$; the derivative of the left-hand side is $1$, while the derivative of the right-hand side is $3(u+1)^{1/2} \geq 3$ for $u \geq 0$. Hence the right-hand side grows at least as fast as the left-hand side, the inequality holds for all $u \geq 0$, and $t x \log x - \log x$ is self-concordant on $\mathbf{R}_{++}$ for every $t > 0$.
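
The reduced inequality can also be checked numerically; the snippet below (an illustrative check only) verifies that $2(u+1)^{3/2} - (u+2) \geq 0$ on a grid of $u$ values, with equality at $u = 0$.

import numpy as np

# Numerical check of the key inequality in part (b):
# u + 2 <= 2*(u + 1)**1.5 for all u = t*x >= 0.
u = np.linspace(0.0, 100.0, 100001)
gap = 2.0 * (u + 1.0)**1.5 - (u + 2.0)
print("minimum gap over [0, 100]:", gap.min())  # 0.0, attained at u = 0
assert np.all(gap >= -1e-12)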

Textbooks

  • An Introduction to Stochastic Modeling, Fourth Edition by Pinsky and Karlin (freely available through the university library here)
  • Essentials of Stochastic Processes, Third Edition by Durrett (freely available through the university library here)

To reiterate, the textbooks are freely available through the university library. Note that you must be connected to the university Wi-Fi or VPN to access the ebooks from the library links. Furthermore, the library links take some time to populate, so do not be alarmed if the webpage looks bare for a few seconds.

AM221 Convex optimization

Statistics-lab™ can provide you with assignment, exam, and tutoring services for the harvard.edu AM221 Convex optimization course! Look for Statistics-lab™. Statistics-lab™ safeguards your study-abroad journey.
