April, 2022 - François HU
Master of Science - EPITA
This lecture is available here: https://curiousml.github.io/
with $f: \mathbb{R}^n \to \mathbb{R}$, $g: \mathbb{R}^n \to \mathbb{R}^m$ and $h: \mathbb{R}^n \to \mathbb{R}^p$
where $\nabla f(x) = \left[\frac{\partial f(x)}{\partial x_1}, \frac{\partial f(x)}{\partial x_2}, \cdots, \frac{\partial f(x)}{\partial x_n}\right]$ is the gradient of $f$
For a given critical point $x^*$ (i.e. $\nabla f(x^*) = 0$), if $H_f(x^*)$ is positive definite then $x^*$ is a local minimum; if it is negative definite, $x^*$ is a local maximum; and if it is indefinite, $x^*$ is a saddle point
Objective: Let us minimize the function $$ f(x) = 0.5 - xe^{-x^2} $$
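As a quick sanity check before applying any numerical method, this particular $f$ can be minimized analytically:

$$ f'(x) = e^{-x^2}\left(2x^2 - 1\right) = 0 \iff x^* = \frac{1}{\sqrt{2}} \approx 0.707 $$

and $f''(x) = 2x\,e^{-x^2}\left(3 - 2x^2\right)$ gives $f''(x^*) = 2\sqrt{2}\,e^{-1/2} > 0$, so $x^*$ is indeed a local minimum. The numerical methods below should recover this value.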
We assume $f$ unimodal on $[a, b]$. Let $x_1, x_2\in[a, b]$ such that $x_1<x_2$
By evaluating and comparing $f(x_1)$ and $f(x_2)$, one of the intervals $]x_2, b]$ or $[a, x_1[$ can be discarded: the minimum lies in the remaining interval ($[a, x_2]$ or $[x_1, b]$ respectively)
The process is then iterated, and since one interior point is reused, each iteration requires only one new evaluation of the function
We want to reduce the search interval by the same factor at each iteration, and we want the interior points of the new interval to occupy the same relative positions as in the old one
To do this, the two points are placed at relative positions $1-\tau$ and $\tau$ with $\tau^2 = 1-\tau$. Therefore $\tau = \dfrac{\sqrt{5}-1}{2} \approx 0.618$ (the inverse golden ratio) and $1-\tau \approx 0.382$
Whatever sub-interval is chosen, its length will be $\tau$ times that of the previous interval and the new points will be in position $\tau$ and $1-\tau$ relative to the new interval
The convergence is linear: the interval length shrinks by the constant factor $\tau \approx 0.618$ at each iteration, so after $n$ iterations the error is $O(\tau^n)$
Create a function golden_search(f, a, b, tol)
which returns an approximation of the minimizer of the function f
on the interval $[a, b]$ (for a given tolerance tol) using the golden section search algorithm.
Minimize with golden_search
the function $f(x) = 0.5 - xe^{-x^2}$.
We can set $a = 0$, $b = 2$ and $tol = 0.001$
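A minimal sketch of such a function, assuming the stopping criterion is the interval length falling below `tol` (other criteria are possible):

```python
import math

def golden_search(f, a, b, tol=1e-3):
    """Golden section search: approximate the minimizer of a
    unimodal function f on [a, b]."""
    tau = (math.sqrt(5) - 1) / 2          # ~0.618, reduction factor
    # interior points at relative positions 1 - tau and tau
    x1 = a + (1 - tau) * (b - a)
    x2 = a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # minimum lies in [a, x2]: discard ]x2, b], reuse x1
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - tau) * (b - a)
            f1 = f(x1)                    # one new evaluation
        else:
            # minimum lies in [x1, b]: discard [a, x1[, reuse x2
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)                    # one new evaluation
    return (a + b) / 2

x_star = golden_search(lambda x: 0.5 - x * math.exp(-x**2), 0, 2, 0.001)
```

With $a = 0$, $b = 2$ and $tol = 0.001$, the result should be close to $1/\sqrt{2} \approx 0.7071$.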
Find the minimum of the function $f(x) = 0.5 - xe^{-x^2}$ using Newton's method (you can choose $x_0 = 1$ as a starting point).
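A minimal sketch, assuming Newton's method is applied to $f'$ (update $x_{k+1} = x_k - f'(x_k)/f''(x_k)$) with the closed-form derivatives of $f$:

```python
import math

def f_prime(x):
    # f(x) = 0.5 - x*exp(-x^2)  =>  f'(x) = exp(-x^2) * (2x^2 - 1)
    return math.exp(-x**2) * (2 * x**2 - 1)

def f_second(x):
    # f''(x) = 2x * exp(-x^2) * (3 - 2x^2)
    return 2 * x * math.exp(-x**2) * (3 - 2 * x**2)

def newton_minimize(fp, fpp, x0, tol=1e-8, max_iter=50):
    """Newton's method on f': iterate x <- x - f'(x)/f''(x)
    until the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = fp(x) / fpp(x)
        x -= step
        if abs(step) < tol:
            break
    return x

x_newton = newton_minimize(f_prime, f_second, 1.0)
```

Starting from $x_0 = 1$, the iterates converge to $1/\sqrt{2} \approx 0.7071$ in a handful of steps, illustrating the quadratic convergence of Newton's method near a minimum where $f'' > 0$.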
Compare numerically the time complexity of these two methods.