Algorithms for Optimization [pdf]

algorithmsbook.com

343 points by Anon84 20 hours ago


klamike - 18 hours ago

Great to see optimization on the front page of HN! One thing I love about the book is that it's full of really nice figures. If, like me, you love visualizations, you may enjoy this website I've been working on to visualize linear programming (LP) solvers: https://lpviz.net.

It's by no means polished, but it's pretty fun to play around with: you can drag the vertices/constraints around and watch how the iterates of different LP algorithms (described in sections 11 and 12 of the book) react to changes in the feasible region/objective.

If you go to https://lpviz.net/?demo, it will draw a polytope for you and click around the interface to show off some of the features. I'm constantly chipping away at it in my free time, and I welcome any feedback and suggestions!
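If you've never written one of these LPs down, here's a minimal sketch in Julia of the kind of 2-D problem you can drag around on the site (this is not lpviz's code, just a toy modeled with JuMP and solved with HiGHS):

    using JuMP, HiGHS

    model = Model(HiGHS.Optimizer)
    @variable(model, x >= 0)
    @variable(model, y >= 0)
    @constraint(model, x + 2y <= 4)   # each constraint is a half-plane you can drag
    @constraint(model, 3x + y <= 6)
    @objective(model, Max, 2x + y)    # the objective direction you can rotate

    optimize!(model)
    println("optimum at (", value(x), ", ", value(y), "), value ", objective_value(model))

Conceptually, dragging a constraint or rotating the objective is just editing those coefficients and watching how the solver's iterates respond.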

cchianel - 15 hours ago

Some additional optimization resources (for metaheuristics, where you only have the objective/score function and no derivative):

- "Essentials of Metaheuristics" by Sean Luke https://cs.gmu.edu/~sean/book/metaheuristics/

- "Clever Algorithms" by Jason Brownlee https://cleveralgorithms.com/

Timefold uses the metaheuristic algorithms in these books (Tabu Search, Late Acceptance, Simulated Annealing, etc.) to find near-optimal solutions quickly from a score function (typically defined in a Java stream-like/SQL-like syntax so the score can be recalculated incrementally, which speeds up evaluation).
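To make the "only a score function" point concrete, here's a generic simulated-annealing sketch (in Julia, to match the book; this is not Timefold's implementation, just the textbook idea). All the algorithm ever sees is a score and a neighbor move, no derivatives:

    # generic simulated annealing: maximizes a score using only
    # a score function and a "neighbor" move, no derivatives
    function simulated_annealing(score, neighbor, x0; T0=1.0, cooling=0.995, iters=10_000)
        x, best = x0, x0
        T = T0
        for _ in 1:iters
            y = neighbor(x)
            Δ = score(y) - score(x)
            # always accept improvements; accept worse moves with probability exp(Δ/T)
            if Δ > 0 || rand() < exp(Δ / T)
                x = y
                best = score(x) > score(best) ? x : best
            end
            T *= cooling   # cool down so later moves get greedier
        end
        return best
    end

    # toy usage: maximize a bumpy 1-D score with random local moves
    score(x)    = -(x - 3)^2 + sin(5x)
    neighbor(x) = x + 0.1 * randn()
    println(simulated_annealing(score, neighbor, 0.0))

Late Acceptance and Tabu Search mostly change the accept/reject rule (and which moves are allowed), and the incremental score calculation mentioned above is about making that score function cheap to re-evaluate after a small move.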

You can see simplified diagrams of these algorithms in action in Timefold's docs: https://docs.timefold.ai/timefold-solver/latest/optimization....

Disclosure: I work for Timefold.

kragen - 16 hours ago

This is a 521-page CC-licensed book on optimization which looks absolutely fantastic. It starts out with modern gradient-based algorithms rooted in automatic differentiation, including recent things like Adam, rather than the historically more important linear optimization algorithms like the simplex method (the 24-page chapter 12 covers linear optimization). There are a number of chapters on things I haven't even heard of, and, best of all, there are exercises.

I've been wanting something like this for a long time, and I regret not knowing about the first edition.

If you are wondering why this is a more interesting problem than, say, sorting a list, the answer is that optimization algorithms are attempts at the ideal of a fully general problem solver. Instead of writing a program to solve the problem, you write a program to recognize what a solution would look like, which is often much easier (for example, with a labeled dataset). Then you apply the optimization algorithm to your program. That is how current AI is being done, with automatic differentiation and variants of Adam, but there are many other optimization algorithms that may be better alternatives in some circumstances.
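To make that loop concrete, here's a rough sketch in Julia (my own toy, not code from the book): a loss that scores candidate parameters against a labeled dataset, ForwardDiff for the automatic differentiation, and a hand-rolled Adam doing the search.

    using ForwardDiff

    # toy labeled dataset: y ≈ 2x + 1; θ = [slope, intercept]
    xs = collect(0.0:0.1:1.0)
    ys = 2 .* xs .+ 1 .+ 0.01 .* randn(length(xs))

    # "what a solution looks like": low squared error on the labeled data
    loss(θ) = sum(abs2, θ[1] .* xs .+ θ[2] .- ys)

    function adam(f, θ; α=0.05, β1=0.9, β2=0.999, ϵ=1e-8, steps=2000)
        m, v = zero(θ), zero(θ)
        for t in 1:steps
            g = ForwardDiff.gradient(f, θ)       # automatic differentiation
            m = β1 .* m .+ (1 - β1) .* g         # first-moment estimate
            v = β2 .* v .+ (1 - β2) .* g .^ 2    # second-moment estimate
            m̂ = m ./ (1 - β1^t)                  # bias corrections
            v̂ = v ./ (1 - β2^t)
            θ = θ .- α .* m̂ ./ (sqrt.(v̂) .+ ϵ)   # Adam update
        end
        return θ
    end

    println(adam(loss, [0.0, 0.0]))   # should end up close to [2, 1]

The point is that nothing inside adam knows anything about lines or datasets; it only ever sees the loss and its gradient.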

crystal_revenge - 19 hours ago

This book and Kochenderfer's earlier book "Decision Making Under Uncertainty" [0] are two of my favorite technical books (and I wouldn't be surprised to find that his newest book, "Algorithms for Decision Making", also falls into this category).

The algorithm descriptions are clear, the visualizations are great, and, as someone who does a lot of ML work, I appreciate that they cover a lot of important topics beyond what you'll find in a standard ML book. This is especially refreshing if you're looking for thinking about optimization that isn't just gradient descent (which has been basically the only mainstream approach to optimization in ML for two decades now).
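As a taste of what "not just gradient descent" can look like, here's a rough cross-entropy-method-style sketch in Julia (my own toy, in the spirit of the stochastic/population methods the book covers, not code from it): sample candidates, keep the best few, refit the sampling distribution, repeat - no gradients anywhere.

    using Statistics

    function cross_entropy(f, μ, σ; nsamples=100, nelite=10, iters=50)
        d = length(μ)
        for _ in 1:iters
            X = μ .+ σ .* randn(d, nsamples)      # each column is a candidate
            order = sortperm([f(X[:, j]) for j in 1:nsamples])
            elite = X[:, order[1:nelite]]         # the lowest-f candidates
            μ = vec(mean(elite; dims=2))          # refit the sampling mean ...
            σ = vec(std(elite; dims=2)) .+ 0.05   # ... and spread (with a noise floor)
        end
        return μ
    end

    f(x) = sum(abs2, x .- [3.0, -2.0])                    # minimized at [3, -2]
    println(cross_entropy(f, [0.0, 0.0], [2.0, 2.0]))     # should land near [3, -2]

Methods like CMA-ES refine this propose-score-adapt loop considerably.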

I've known a few people complain that the code examples are in Julia, but, as someone who doesn't know Julia, if you have any experience with quantitative programming it should be trivial to convert the Julia examples to your favorite language for implementation (and frankly, I'm a bit horrified that so many "smart" people interested in these sorts of topics seem trapped into reasoning in one specific language).

Optimization is such a rich field and should be of interest to any computer scientist who would describe themselves as "interested in solving hard problems" rather than just applying a well-known technique to a specific class of hard problems.

0. https://web.stanford.edu/group/sisl/public/dmu.pdf

rhines - 5 hours ago

Thanks for sharing this; it looks like a useful overview of the field. I wish the first edition had come out a year or two earlier - it would have been a great resource for my undergrad research work. Back then there was no single book covering CMA-ES, surrogate models, and Gaussian processes; everything was scattered across different books and papers, with varying levels of technical depth and differing notations.

marmalade2413 - 12 hours ago

An absolutely fantastic book. I've read it cover to cover a couple of times during my PhD (which focused on neural networks and numerical solvers). It gives the right amount of depth to serve as a detailed introduction whilst covering a lot of the key areas in optimisation. I still use this book as a reference years later.

fertom00 - 4 hours ago

I'm astonished and also disappointed to see that the book has dedicated sections for the metaheuristics Firefly and Cuckoo Search. I don't know what happened there, but any experienced researcher in the field knows that these metaheuristics (among several others) are not taken seriously and are heavily criticized in the community.

There is even a paper in ITOR about this: https://onlinelibrary.wiley.com/doi/abs/10.1111/itor.12001

Unfortunately, there are some "research bubbles" in the community: groups of people who only work with these shady metaheuristics and keep citing each other. It has gotten so bad that I still frequently see people at conferences using such metaheuristics and believing that they are fine, only to get criticized by someone in the audience later.

jonbaer - 8 hours ago

Thanks for the post/link. Excellent chapter on multiobjective optimization - any other books/material with that focus?

sfpotter - 18 hours ago

Can anyone provide a comparison of this book to Nocedal and Wright's book?