Prof Dr Jörg Fliege Semester 2, 2022/2023
School of Mathematics
University of Southampton



MATH6184

Exercise Sheet 3 Sample Solutions


1. Both algorithms stop after one step at the vertex x = (3, 5)^T of the set of feasible points.
At this point, the reduced gradient g is equal to the zero vector: g = (0, 0)^T. (Not a
very interesting exercise, as I now realise. Please vary the starting point randomly and
observe what happens then. Computing two or three iterations is enough.)

2. (a)

   Fr(x1, x2) = (x1 − 5)^2 − 2 x1 x2 + 10 (x2 − 10)^2
                + r max{0, −x1}^2 + r max{0, x1 − 3}^2 + r max{0, −x2}^2 + r max{0, x2 − 5}^2
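As a quick sanity check, the penalty function from part (a) can be coded directly. The Matlab files that accompany the sheet are not reproduced here, so this is a minimal Python sketch; the function name `F` is my own choice, not part of the exercise.

```python
def F(x1, x2, r):
    """Exterior penalty function Fr from part (a): the original objective
    plus quadratic penalties for the box constraints 0 <= x1 <= 3, 0 <= x2 <= 5."""
    obj = (x1 - 5)**2 - 2*x1*x2 + 10*(x2 - 10)**2
    pen = (max(0.0, -x1)**2 + max(0.0, x1 - 3)**2
           + max(0.0, -x2)**2 + max(0.0, x2 - 5)**2)
    return obj + r * pen
```

At any feasible point all four penalty terms vanish, so F agrees with the original objective for every value of r; the penalty only contributes outside the box.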

(b) For x1 > 3 and x2 > 5, we have

    Fr(x1, x2) = (x1 − 5)^2 − 2 x1 x2 + 10 (x2 − 10)^2 + r (x1 − 3)^2 + r (x2 − 5)^2,

and computing the gradient ∇Fr(x1, x2), we see that the necessary condition
∇Fr(x1, x2) = (0, 0)^T is a linear system of equations in the two unknowns x1, x2.
(This system contains r as a parameter.) Rearranging this system we get

    x1 = (1/(1 + r)) x2 + (5 + 3r)/(1 + r) −→ 3,
    x2 = (1/(10 + r)) x1 + (100 + 5r)/(10 + r) −→ 5,


which answers part (c). Solving for x1, x2 leads to

    x1 = (100 + 5r + (5 + 3r)(10 + r)) / ((1 + r)(10 + r) − 1),
    x2 = (1/(10 + r)) · (100 + 5r + (5 + 3r)(10 + r)) / ((1 + r)(10 + r) − 1) + (100 + 5r)/(10 + r).

3. (a)

   Fr(x1, x2) = 2 (x1 − 3)^2 − x1 x2 + (x2 − 5)^2 + r max{0, x1^2 + x2^2 − 1}^2
                + r max{0, −x1}^2 + r max{0, x1 − 2}^2 + r max{0, −x2}^2

(b) The function Fr is convex for r ≥ 0, because it is a sum of convex functions
(the original objective and the four penalty terms), as some elementary calculus shows.
(c) The unconstrained minimum of the original objective is (34/7, 52/7) ≈ (4.86, 7.43),
which is not feasible. The penalty functions added to the original objective will "push"
the minimum towards the set of feasible points as r → +∞, but for finite r the minimum
of Fr will stay outside.
(d) Easily done with the Matlab files provided.
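Since the Matlab files are not reproduced here, part (d) can be imitated with a short Python sketch: plain gradient descent on Fr for a few increasing values of r. The step size, iteration count, and function names below are my own choices for illustration, not part of the exercise.

```python
def Fr(x1, x2, r):
    """Exterior penalty function from part (a)."""
    obj = 2*(x1 - 3)**2 - x1*x2 + (x2 - 5)**2
    pen = (max(0.0, x1**2 + x2**2 - 1)**2 + max(0.0, -x1)**2
           + max(0.0, x1 - 2)**2 + max(0.0, -x2)**2)
    return obj + r * pen

def grad_Fr(x1, x2, r):
    """Gradient of Fr (each squared max-term is continuously differentiable)."""
    c = max(0.0, x1**2 + x2**2 - 1)
    g1 = 4*(x1 - 3) - x2 + r*(4*c*x1 - 2*max(0.0, -x1) + 2*max(0.0, x1 - 2))
    g2 = -x1 + 2*(x2 - 5) + r*(4*c*x2 - 2*max(0.0, -x2))
    return g1, g2

def minimise(r, x1=0.0, x2=0.0, step=5e-4, iters=50000):
    """Fixed-step gradient descent: a crude stand-in for the Matlab files."""
    for _ in range(iters):
        g1, g2 = grad_Fr(x1, x2, r)
        x1 -= step * g1
        x2 -= step * g2
    return x1, x2

for r in (1.0, 10.0, 100.0):
    x1, x2 = minimise(r)
    print(f"r = {r:6}: x = ({x1:.4f}, {x2:.4f}), x1^2 + x2^2 = {x1**2 + x2**2:.4f}")
```

The printed constraint value x1^2 + x2^2 stays above 1 for every finite r but shrinks towards 1 as r grows, which is exactly the behaviour described in part (c).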
