Show HN: DiscoGrad – Automatically differentiate across branches in C++ programs

github.com

2 points by frankling_ 2 years ago · 0 comments · 1 min read


We just pushed a new and improved version of DiscoGrad, a tool for automatic differentiation of C++ programs with parameter-dependent control flow ("if (f(x) < c) { ... }") and randomness. This is needed to accelerate solving various optimization problems involving simulations, where plain autodiff is often useless: branches make each run piecewise in the parameters, so the pathwise derivative misses the jumps.
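
To make the problem concrete, here is a hypothetical toy program (our illustration, not taken from the DiscoGrad repository) with the kind of branch and randomness described above. Each run is piecewise constant in the input x, so the derivative plain autodiff computes is zero almost everywhere and says nothing about how the expected output changes with x:

    #include <cmath>
    #include <random>

    // Toy simulation with a parameter-dependent branch: the output jumps as
    // sin(x) crosses a random threshold, so dy/dx is 0 almost everywhere.
    double toy_program(double x, std::mt19937 &rng) {
      std::normal_distribution<double> noise(0.0, 1.0);
      double c = noise(rng);   // random threshold, distribution not known upfront
      if (std::sin(x) < c)     // branch condition is a function of the input x
        return 1.0;
      return 0.0;
    }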

In essence, DiscoGrad applies an (LLVM-based) source-to-source transformation to your C++ program that adds calls to our header library, which then handles the gradient calculation. The main difference from similar tools/estimators is that it is fully automatic (no need to come up with a differentiable problem formulation or reparametrization) and that the branch condition can be any function of the program inputs (no need to know upfront what distribution the condition follows).
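
For intuition about why a useful gradient exists at all (this is only a crude baseline, not DiscoGrad's estimator): although each run of the toy program above is piecewise constant in x, its expectation over the randomness is a smooth function of x, so one could estimate the gradient with a finite difference on a Monte Carlo average. Automatic estimators aim to replace exactly this kind of expensive, high-variance procedure:

    #include <random>

    // Crude baseline: central finite difference on a Monte Carlo estimate of
    // the expected output of toy_program (defined above).
    double expected_output(double x, int samples, std::mt19937 &rng) {
      double sum = 0.0;
      for (int i = 0; i < samples; ++i)
        sum += toy_program(x, rng);
      return sum / samples;
    }

    double fd_gradient(double x, double h, int samples, std::mt19937 &rng) {
      return (expected_output(x + h, samples, rng) -
              expected_output(x - h, samples, rng)) / (2.0 * h);
    }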

We're currently a team of two working on DiscoGrad as part of a research project, so don't expect production-grade code quality, but we do intend for it to be more than a throwaway research prototype. Use cases we've successfully tested include calibrating simulation models of epidemics or evacuation scenarios via gradient descent, and combining simulations with neural networks.
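
As a sketch of what calibration via gradient descent looks like (hypothetical code, using the crude fd_gradient above as a stand-in for the gradients a tool like DiscoGrad provides): treat the expected simulation output as the loss and step against its gradient:

    #include <random>

    // Hypothetical calibration loop: minimize the expected output of the toy
    // simulation by gradient descent on the parameter x.
    double calibrate(double x0, double lr, int steps, std::mt19937 &rng) {
      double x = x0;
      for (int i = 0; i < steps; ++i) {
        double g = fd_gradient(x, /*h=*/0.1, /*samples=*/10000, rng);
        x -= lr * g;
      }
      return x;
    }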

We hope you find this interesting/useful and are happy to answer questions!

