Week 1
Unconstrained problem
min_{𝑥 ∈ ℝⁿ} 𝑓(𝑥), where 𝑥 ∈ ℝⁿ is a real vector with 𝑛 ≥ 1 components and 𝑓: ℝⁿ → ℝ is a function (usually smooth)
Smooth function
Function 𝑓 is smooth if it has derivatives of all orders everywhere on its domain. Almost
every function without ‘kinks’ is smooth
Difference between inf and min
inf_{𝑥∈𝑆} 𝑓(𝑥): the largest value 𝑘 such that 𝑓(𝑥) ≥ 𝑘 for all 𝑥 ∈ 𝑆; always exists (possibly −∞)
min_{𝑥∈𝑆} 𝑓(𝑥): the smallest value 𝑘 such that there exists an 𝑥 ∈ 𝑆 for which 𝑓(𝑥) = 𝑘; may not exist
Example: 𝑓(𝑥) = 𝑒ˣ on 𝑆 = ℝ has inf_{𝑥∈𝑆} 𝑓(𝑥) = 0, but no minimum, since 𝑒ˣ > 0 for every 𝑥
Weierstrass theorem – informal
A continuous function on a restricted domain has both a global maximum and a global minimum. 'Restricted domain' has to be made formal (closed and bounded, below)
Closed and bounded sets
A set is closed if all limit points of the set are contained in the set. Informally: “all borders
are included”.
A set is a bounded set if there exists an 𝑀 with 0 < 𝑀 < ∞ such that for every 𝑥 in the set it holds that −𝑀 ≤ 𝑥ᵢ ≤ 𝑀 for all 𝑖. Informally: "the set fits in a box".
A compact set is one that is both closed and bounded
Weierstrass theorem
A continuous function on a nonempty, closed and bounded (compact) set in ℝⁿ attains its global maximum and minimum
Fermat theorem
If 𝑥* is a local minimizer and 𝑓 is differentiable at 𝑥*, then ∇𝑓(𝑥*) = 0. The Fermat theorem gives a first-order necessary condition (FONC) for unconstrained minimization
4-step method
1. Model the problem and establish existence of global minima (Weierstrass theorem)
2. Write down first order necessary conditions (FONC), e.g. using Fermat theorem
3. Investigate all points of interest (points that meet FONC, and boundary points)
4. Give a conclusion
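As an illustration of the four steps, here is a minimal sketch on an invented one-dimensional example, min 𝑓(𝑥) = 𝑥⁴ − 8𝑥² over [−3, 3], using sympy; the function and interval are hypothetical.

```python
import sympy as sp

x = sp.symbols('x')
f = x**4 - 8*x**2                  # invented example on the compact set [-3, 3]

# Step 1: f is continuous and [-3, 3] is closed and bounded,
#         so the Weierstrass theorem guarantees a global minimum exists.
# Step 2: FONC via the Fermat theorem: solve f'(x) = 0.
stationary = sp.solve(sp.diff(f, x), x)           # [-2, 0, 2]

# Step 3: investigate all points of interest: stationary points inside
#         the domain plus the boundary points -3 and 3.
candidates = [c for c in stationary if -3 <= c <= 3] + [-3, 3]
values = {c: f.subs(x, c) for c in candidates}

# Step 4: conclusion -- the smallest value among the candidates.
print(min(values.items(), key=lambda kv: kv[1]))  # (-2, -16); x = 2 ties
```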
Convex sets
A set 𝑆 ⊆ ℝⁿ is a convex set if the straight-line segment connecting any two points in 𝑆 lies entirely inside 𝑆: 𝑥, 𝑦 ∈ 𝑆 ⇒ 𝛼𝑥 + (1 − 𝛼)𝑦 ∈ 𝑆 for all 𝛼 ∈ [0,1]
Convex functions
𝑓 is a convex function if its domain 𝑆 is convex, and 𝑥, 𝑦 ∈ 𝑆 ⇒ 𝑓(𝛼𝑥 + (1 − 𝛼)𝑦) ≤ 𝛼𝑓(𝑥) + (1 − 𝛼)𝑓(𝑦) for all 𝛼 ∈ [0,1].
Equivalently, 𝑓 is a convex function if its domain 𝑆 is convex and epi(𝑓) is a convex set (epi = epigraph)
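The defining inequality can be spot-checked numerically; below is a minimal sketch for the known convex example 𝑓(𝑥) = 𝑥² (the sample count and tolerance are arbitrary choices).

```python
import numpy as np

# Spot-check f(a*x + (1-a)*y) <= a*f(x) + (1-a)*f(y) for f(x) = x**2.
f = lambda t: t**2
rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=2)               # random pair of points
    a = rng.uniform()                       # random alpha in [0, 1)
    assert f(a*x + (1-a)*y) <= a*f(x) + (1-a)*f(y) + 1e-12
print("inequality held on all samples")
```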
Tests for convexity
Given convex functions 𝑓 and 𝑔:
- 𝑓(𝑥) + 𝑔(𝑥) is convex
- 𝑓(𝐴𝑥 + 𝑏) is convex for any matrix 𝐴 and vector 𝑏
- 𝛼𝑓(𝑥) is convex ∀𝛼 ≥ 0
- max{𝑓(𝑥), 𝑔(𝑥)} is convex (may ruin smoothness)
- 𝑔(𝑓(𝑥)) is convex if 𝑓 is convex and 𝑔: ℝ → ℝ is convex and nondecreasing
- Given concave function ℎ: −ℎ(𝑥) is convex
Hessian
Given a twice differentiable function 𝑓 on domain 𝑆, the Hessian ∇²𝑓(𝑥) is the 𝑛 × 𝑛 matrix whose (𝑖, 𝑗) entry is ∂²𝑓/∂𝑥ᵢ∂𝑥ⱼ:
∇²𝑓(𝑥) =
[ ∂²𝑓/∂𝑥₁²      ∂²𝑓/∂𝑥₁∂𝑥₂   ⋯   ∂²𝑓/∂𝑥₁∂𝑥ₙ ]
[ ∂²𝑓/∂𝑥₂∂𝑥₁    ∂²𝑓/∂𝑥₂²     ⋯   ∂²𝑓/∂𝑥₂∂𝑥ₙ ]
[      ⋮              ⋮        ⋱        ⋮      ]
[ ∂²𝑓/∂𝑥ₙ∂𝑥₁    ∂²𝑓/∂𝑥ₙ∂𝑥₂   ⋯   ∂²𝑓/∂𝑥ₙ²   ]
If its domain 𝑆 is convex and ∇²𝑓(𝑥) is positive semidefinite for all 𝑥 ∈ 𝑆, then 𝑓 is a convex function. That is, all eigenvalues of ∇²𝑓(𝑥) are nonnegative.
If its domain 𝑆 is convex and ∇²𝑓(𝑥) is positive definite for all 𝑥 ∈ 𝑆, then 𝑓 is a strictly convex function. That is, all eigenvalues of ∇²𝑓(𝑥) are positive
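A minimal sketch of this eigenvalue test; the function 𝑓(𝑥) = 𝑥₁² + 𝑥₁𝑥₂ + 𝑥₂² is an invented example whose Hessian is the constant matrix below.

```python
import numpy as np

# Hypothetical example: f(x) = x1^2 + x1*x2 + x2^2 has the constant Hessian
# H = [[2, 1], [1, 2]] (matrix of second partial derivatives of f).
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eig = np.linalg.eigvalsh(H)                  # eigenvalues of a symmetric matrix
print(eig)                                   # [1. 3.]
print("convex:", bool(np.all(eig >= 0)))           # PSD everywhere -> convex
print("strictly convex:", bool(np.all(eig > 0)))   # PD everywhere -> strictly convex
```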
Quadratic forms
𝑓(𝑥) = ½ 𝑥ᵀ𝐴𝑥 + 𝑏ᵀ𝑥 + 𝑐, with 𝐴 symmetric. Then ∇𝑓(𝑥) = 𝐴𝑥 + 𝑏 and ∇²𝑓(𝑥) = 𝐴. A quadratic form is convex ⟺ 𝐴 is positive semidefinite, and strictly convex ⟺ 𝐴 is positive definite
Local is global
For unconstrained convex problems, 'local is global': every local minimizer is also a global minimizer
Convex Fermat theorem
Let 𝑓 be a convex, differentiable function. If there exists an 𝑥* such that ∇𝑓(𝑥*) = 0, then 𝑥* is a global minimizer of 𝑓. The convex Fermat theorem gives a first-order sufficient condition for unconstrained minimization
Shortcut 4-step method
You may skip the Weierstrass step and simply assume a minimum exists. Then show that some point satisfies the sufficient conditions of the convex Fermat theorem. If you find such a point, the assumption was justified: that point is a global minimizer
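A sketch of this shortcut on a quadratic form, with 𝐴 and 𝑏 invented for the example; since the hypothetical 𝐴 below is positive definite, 𝑓 is strictly convex and solving ∇𝑓(𝑥) = 𝐴𝑥 + 𝑏 = 0 is sufficient by the convex Fermat theorem.

```python
import numpy as np

# Hypothetical quadratic form f(x) = 1/2 x^T A x + b^T x + c.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])                 # symmetric
b = np.array([-1.0, 4.0])

assert np.all(np.linalg.eigvalsh(A) > 0)   # A positive definite -> f strictly convex

# Convex Fermat: a point with vanishing gradient is a global minimizer.
x_star = np.linalg.solve(A, -b)            # solve grad f(x) = A x + b = 0
print(x_star)
print(np.allclose(A @ x_star + b, 0.0))    # True: the FONC holds at x_star
```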
Subgradients
Let 𝑓 be a convex function on domain 𝑆. Then 𝑟 is a subgradient of 𝑓 at the point 𝑥 ∈ 𝑆 if 𝑓(𝑦) ≥ 𝑓(𝑥) + 𝑟ᵀ(𝑦 − 𝑥) for all 𝑦 ∈ 𝑆. Where 𝑓 is differentiable, the gradient is a subgradient. The set of all subgradients at the point 𝑥 is called the subdifferential of 𝑓 at 𝑥, denoted by ∂𝑓(𝑥)
Convex Fermat theorem revisited
Let 𝑓 be a convex function. If there exists an 𝑥* such that 0 ∈ ∂𝑓(𝑥*), then 𝑥* is a global minimizer of 𝑓. This is the convex Fermat theorem generalized to non-differentiable functions
Subdifferential calculus
Given convex functions 𝑓 and 𝑔:
- If 𝑓 is differentiable at 𝑥, then ∂𝑓(𝑥) = {∇𝑓(𝑥)}
- 𝜕(𝛼𝑓)(𝑥) = 𝛼𝜕𝑓(𝑥) for all 𝛼 ≥ 0
- 𝜕(𝑓 + 𝑔)(𝑥) = 𝜕𝑓(𝑥) + 𝜕𝑔(𝑥) (Minkowski sum)
- ∂(‖·‖)(0) = 𝐷ⁿ (the unit ball in dimension 𝑛)
- ∂(max{𝑓, 𝑔})(𝑥) = co(∂𝑓(𝑥) ∪ ∂𝑔(𝑥)) if 𝑓(𝑥) = 𝑔(𝑥), where co(·) denotes the convex hull
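As a worked illustration combining these rules with the revisited convex Fermat theorem: for 𝑓(𝑥) = |𝑥| = max{𝑥, −𝑥}, the max rule gives ∂𝑓(0) = co({−1} ∪ {1}) = [−1, 1], so 0 ∈ ∂𝑓(0) and 𝑥 = 0 is a global minimizer. A small numerical check (the sample grid is an arbitrary choice):

```python
import numpy as np

# f(x) = |x| = max{x, -x}. At x = 0 both branches agree, so the max rule
# gives the subdifferential co({-1} U {1}) = [-1, 1], which contains 0:
# by the generalized convex Fermat theorem, x = 0 is a global minimizer.
ys = np.linspace(-5.0, 5.0, 101)            # arbitrary sample grid

# Every r in [-1, 1] satisfies the subgradient inequality |y| >= |0| + r*(y - 0).
for r in (-1.0, -0.5, 0.0, 0.5, 1.0):       # candidate subgradients in [-1, 1]
    assert np.all(np.abs(ys) >= r * ys)

# A slope outside [-1, 1] is not a subgradient at 0: the inequality fails.
print(np.all(np.abs(ys) >= 1.5 * ys))       # False
```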
Week 2
Bisection method
Method to find a local minimum of a continuously differentiable function 𝑓: ℝ → ℝ on the interval [𝐿, 𝑈], given that 𝑓′(𝐿) < 0 and 𝑓′(𝑈) > 0.
Determine the midpoint 𝑀 = (𝐿 + 𝑈)/2 and evaluate 𝑓′(𝑀). If 𝑓′(𝑀) < 0, set 𝐿 = 𝑀; if 𝑓′(𝑀) > 0, set 𝑈 = 𝑀; if 𝑓′(𝑀) = 0, set 𝐿 = 𝑀 and 𝑈 = 𝑀. Repeat until the interval [𝐿, 𝑈] is sufficiently small.
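A minimal sketch of this method; the test function 𝑓(𝑥) = (𝑥 − 1)², hence 𝑓′(𝑥) = 2(𝑥 − 1), and the tolerance are invented for the example.

```python
def bisection(df, L, U, tol=1e-8):
    """Find a local minimizer of f on [L, U] by bisecting on the sign of
    the derivative df, given df(L) < 0 < df(U)."""
    assert df(L) < 0 < df(U)
    while U - L > tol:
        M = (L + U) / 2                # midpoint of the current interval
        d = df(M)
        if d < 0:
            L = M                      # minimizer lies to the right of M
        elif d > 0:
            U = M                      # minimizer lies to the left of M
        else:
            return M                   # exact stationary point found
    return (L + U) / 2

# Hypothetical example: f(x) = (x - 1)**2 on [0, 3], so f'(x) = 2*(x - 1).
print(bisection(lambda x: 2*(x - 1), 0.0, 3.0))   # ~1.0
```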