This book on Riemannian optimization by Nicolas Boumal was published by Cambridge University Press in 2023.
You can also download the pre-publication PDF.
This website further offers recorded lectures (videos + slides) and exercises, as a companion to the book.
Feel free to e-mail me about any mistakes you spot or suspect, be they typos or more serious issues. Reported errata are added as sticky notes in the PDF above (last updated on Sep. 15, 2023). I appreciate your input, always.
For a shorter video introduction, click the More tab.
About the book
Optimization on manifolds is the result of smooth geometry and optimization merging into one elegant modern framework. This text introduces the differential and Riemannian geometry concepts that help students and researchers in applied mathematics, computer science and engineering gain a firm mathematical grounding to use these tools confidently in their research.
All definitions and theorems are motivated to build time-tested optimization algorithms. Starting from first principles, the text goes on to cover current research on topics including iteration complexity and geodesic convexity. Readers will appreciate the tricks of the trade sprinkled throughout the book, to guide research and numerical implementations.
This book has no prerequisites in geometry or optimization. Chapters 3 and 5 can serve as a standalone introduction to differential and Riemannian geometry, focused on embedded submanifolds of linear spaces, with proofs and an eye towards computability. The book then builds on that foundation to cover algorithms and to equip the reader for modern research challenges.
Chapter 8 provides the general theory so that we can build quotient manifolds in Chapter 9. The optimization algorithms in Chapters 4 and 6 apply to the general case, but can already be understood after reading Chapters 3 and 5. Chapter 7 details examples of submanifolds that come up in practice. Chapter 10 covers more advanced Riemannian tools, and Chapter 11 introduces geodesic convexity.
Teaching
Taught as a course, this material is popular with applied mathematicians, computer scientists and mathematically inclined engineering students at the graduate and advanced undergraduate levels.
In a one-semester graduate course in the mathematics department at Princeton University in 2019 and 2020 (24 lectures of 80 minutes each, two projects, no exercises), I covered much of (what became) Chapters 1–6 and select parts of Chapter 7 before the midterm break, then much of Chapters 8–9 and select parts of Chapters 10–11 after the break. Those chapters were shorter at the time, but it still made for a sustained pace.
At EPFL in 2021, I covered mostly Chapters 1–8 in 13 lectures of 90 minutes each, plus as many exercise sessions and two projects. In 2023, the lectures were recorded: see the Lectures tab above.
If you teach or plan to teach this topic, feel free to e-mail me: I can share more resources.
How to cite
@Book{boumal2023intromanifolds,
  title     = {An introduction to optimization on smooth manifolds},
  author    = {Boumal, Nicolas},
  publisher = {Cambridge University Press},
  year      = {2023},
  url       = {https://www.nicolasboumal.net/book},
  doi       = {10.1017/9781009166164}
}
It is best to reference sections, theorems, equations, etc., as all such objects are numbered identically in the pre-publication PDF available here and in the published version. In contrast, page numbers differ.
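For example, in a LaTeX document a specific numbered object can be cited through the optional argument of \cite. A minimal sketch, assuming the entry above is saved in a file named refs.bib (a name chosen here for illustration) and using a placeholder theorem number rather than a pointer to actual content:

\documentclass{article}
\begin{document}
% Cite the book as a whole, then point to a numbered object via the optional
% argument of \cite (the theorem number below is only a placeholder).
See \cite{boumal2023intromanifolds}, in particular \cite[Thm.~4.2]{boumal2023intromanifolds}.
\bibliographystyle{plain}
\bibliography{refs}  % assumes the BibTeX entry above is stored in refs.bib
\end{document}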
Table of contents
- Preface
- 1. Introduction
- 2. Simple examples
- 2.1 Sensor network localization from directions: an affine subspace
- 2.2 Single extreme eigenvalue or singular value: spheres
- 2.3 Dictionary learning: products of spheres
- 2.4 Principal component analysis: Stiefel and Grassmann
- 2.5 Synchronization of rotations: special orthogonal group
- 2.6 Low-rank matrix completion: fixed-rank manifold
- 2.7 Gaussian mixture models: positive definite matrices
- 2.8 Smooth semidefinite programs
- 3. Embedded geometry: first order
- 3.1 Reminders of Euclidean space
- 3.2 Embedded submanifolds of a linear space
- 3.3 Smooth maps on embedded submanifolds
- 3.4 The differential of a smooth map
- 3.5 Vector fields and the tangent bundle
- 3.6 Moving on a manifold: retractions
- 3.7 Riemannian manifolds and submanifolds
- 3.8 Riemannian gradients
- 3.9 Local frames*
- 3.10 Notes and references
- 4. First-order optimization algorithms
- 4.1 A first-order Taylor expansion on curves
- 4.2 First-order optimality conditions
- 4.3 Riemannian gradient descent
- 4.4 Regularity conditions and iteration complexity
- 4.5 Backtracking line-search
- 4.6 Local convergence*
- 4.7 Computing gradients*
- 4.8 Numerically checking a gradient*
- 4.9 Notes and references
- 5. Embedded geometry: second order
- 5.1 The case for another derivative of vector fields
- 5.2 Another look at differentials of vector fields in linear spaces
- 5.3 Differentiating vector fields on manifolds: connections
- 5.4 Riemannian connections
- 5.5 Riemannian Hessians
- 5.6 Connections as pointwise derivatives*
- 5.7 Differentiating vector fields on curves
- 5.8 Acceleration and geodesics
- 5.9 A second-order Taylor expansion on curves
- 5.10 Second-order retractions
- 5.11 Special case: Riemannian submanifolds*
- 5.12 Special case: metric projection retractions*
- 5.13 Notes and references
- 6. Second-order optimization algorithms
- 6.1 Second-order optimality conditions
- 6.2 Riemannian Newton's method
- 6.3 Computing Newton steps: conjugate gradients
- 6.4 Riemannian trust regions
- 6.5 The trust-region subproblem: truncated CG
- 6.6 Local convergence of RTR with tCG*
- 6.7 Simplified assumptions for RTR with tCG*
- 6.8 Numerically checking a Hessian*
- 6.9 Notes and references
- 7. Embedded submanifolds: examples
- 7.1 Euclidean spaces as manifolds
- 7.2 The unit sphere in a Euclidean space
- 7.3 The Stiefel manifold: orthonormal matrices
- 7.4 The orthogonal group and rotation matrices
- 7.5 Fixed-rank matrices
- 7.6 The hyperboloid model
- 7.7 Manifolds defined by $h(x) = 0$
- 7.8 Notes and references
- 8. General manifolds
- 8.1 A permissive definition
- 8.2 The atlas topology, and a final definition
- 8.3 Embedded submanifolds are manifolds
- 8.4 Tangent vectors and tangent spaces
- 8.5 Differentials of smooth maps
- 8.6 Tangent bundles and vector fields
- 8.7 Retractions and velocity of a curve
- 8.8 Coordinate vector fields as local frames
- 8.9 Riemannian metrics and gradients
- 8.10 Lie brackets as vector fields
- 8.11 Riemannian connections and Hessians
- 8.12 Covariant derivatives and geodesics
- 8.13 Taylor expansions and second-order retractions
- 8.14 Submanifolds embedded in manifolds
- 8.15 Notes and references
- 9. Quotient manifolds
- 9.1 A definition and a few facts
- 9.2 Quotient manifolds through group actions
- 9.3 Smooth maps to and from quotient manifolds
- 9.4 Tangent, vertical and horizontal spaces
- 9.5 Vector fields
- 9.6 Retractions
- 9.7 Riemannian quotient manifolds
- 9.8 Gradients
- 9.9 A word about Riemannian gradient descent
- 9.10 Connections
- 9.11 Hessians
- 9.12 A word about Riemannian Newton's method
- 9.13 Total space embedded in a linear space
- 9.14 Horizontal curves and covariant derivatives
- 9.15 Acceleration, geodesics and second-order retractions
- 9.16 Grassmann manifold: summary*
- 9.17 Notes and references
- 10. Additional tools
- 10.1 Distance, geodesics and completeness
- 10.2 Exponential and logarithmic maps
- 10.3 Parallel transport
- 10.4 Lipschitz conditions and Taylor expansions
- 10.5 Transporters
- 10.6 Finite difference approximation of the Hessian
- 10.7 Tensor fields and their covariant differentiation
- 10.8 Notes and references
- 11. Geodesic convexity
- 11.1 Convex sets and functions in linear spaces
- 11.2 Geodesically convex sets and functions
- 11.3 Alternative definitions of geodesically convex sets*
- 11.4 Differentiable geodesically convex functions
- 11.5 Geodesic strong convexity and Lipschitz continuous gradients
- 11.6 Example: positive reals and geometric programming
- 11.7 Example: positive definite matrices
- 11.8 Notes and references
- Bibliography