*June, 2022 - François HU*

*Master of Science - EPITA*

*This lecture is available here: https://curiousml.github.io/*

**Generalities on optimization problems**

- Notion of critical point
- Necessary and sufficient conditions of optimality

**Unconstrained optimization in dimension $n=1$**

- Golden section search
- Newton's method

**Unconstrained optimization in dimension $n\geq 2$**

- Newton's method
- Gradient descent method
- Finite-difference method
- Cross-Entropy method

**Constrained optimization**

- Equality-constrained optimization and Lagrange multipliers
- Inequality-constrained optimization and Lagrange duality

**Some numerical methods in linear algebra**

- A **dimensionality reduction** transforms data from a high-dimensional space into a low-dimensional space, in order to:
    - **Avoid the curse of dimensionality**: sparse high-dimensional data is dangerous
    - **Reduce noise**: there might be too much noise in high-dimensional data
    - **Visualize data (2D or 3D visualization)**: we cannot visualize data in more than 3 dimensions

**PCA** (Principal Component Analysis) can be used as a dimensionality reduction technique while keeping as much of the data's variation as possible.
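As a concrete illustration, PCA can be sketched with a few lines of NumPy: center the data, take the SVD of the centered matrix, and project onto the leading right singular vectors. The function name and signature below are illustrative, not from the lecture.

```python
import numpy as np

def pca(X, n_components):
    """Project X onto its first n_components principal components.

    A minimal SVD-based sketch (illustrative, not the lecture's code).
    """
    # Center the data: principal directions are defined on mean-centered data
    X_centered = X - X.mean(axis=0)
    # SVD of the centered matrix: rows of Vt are the principal directions,
    # ordered by decreasing explained variance
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    # Project onto the leading directions
    return X_centered @ Vt[:n_components].T

# Example: reduce 5-dimensional data to 2 dimensions
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X_reduced = pca(X, 2)
print(X_reduced.shape)  # (100, 2)
```

By construction, the first projected coordinate captures at least as much variance as the second, which is exactly the "keep as much of the data's variation as possible" property.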