Yagrad – 100 SLOC autograd engine with complex numbers and fixed DAG

github.com

33 points by noway421 2 years ago · 4 comments

bionhoward 2 years ago

Elegant. I want to review this more. Could __slots__ work here? I always compulsively try that to save memory. Keep it up.

  • noway421OP 2 years ago

    Great idea!

    I'm testing it on a 3-layer perceptron, so memory is less of an issue, but __slots__ seems to speed up the training time by 5%! Pushed the implementation to a branch: https://github.com/noway/yagrad/blob/slots/train.py

    Unfortunately it extends the line count past 100 lines, so I'll keep it separate from `main`.

    I have my email address on my website (which is in my bio) - don't hesitate to reach out. Cheers!
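A minimal sketch of the `__slots__` idea discussed above, applied to an autograd-style node. The `Value` class and its field names here are illustrative, not taken from yagrad's actual source: declaring `__slots__` replaces the per-instance `__dict__` with fixed storage, which saves memory and can speed up attribute access.

```python
class Value:
    # Fixed attribute set: no per-instance __dict__ is allocated.
    __slots__ = ("data", "grad", "_prev", "_backward")

    def __init__(self, data, prev=()):
        self.data = data          # node value (may be complex)
        self.grad = 0.0           # accumulated gradient
        self._prev = prev         # parent nodes in the DAG
        self._backward = lambda: None  # local backward rule
```

One trade-off: instances of a slotted class reject attributes not listed in `__slots__`, so ad-hoc debugging fields raise `AttributeError`.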

spadufed 2 years ago

What are some common examples of complex numbers in these sorts of applications?

  • noway421OP 2 years ago

    Here complex numbers are used for an elegant gradient calculation - you can express all sorts of operations through just three functions: `exp`, `log` and `add`, defined over the complex plane. Simplifies the code!

    The added benefit is that all the variables become complex. As long as your loss is real-valued you should be able to backprop through your net and update the parameters.

    PyTorch docs mention that complex variables may be used "in audio and other fields": https://pytorch.org/docs/stable/notes/autograd.html#how-is-w...
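The `exp`/`log`/`add` decomposition described above can be sketched with Python's `cmath`. The `mul` and `neg` compositions below are illustrative, not yagrad's actual code: multiplication becomes a sum of logarithms, and negation uses the identity `exp(i*pi) = -1` (assuming nonzero operands, since `log(0)` is undefined).

```python
import cmath

# The three primitive operations, defined over the complex plane.
def add(a, b): return a + b
def exp(a): return cmath.exp(a)
def log(a): return cmath.log(a)

def mul(a, b):
    # a * b = exp(log a + log b)
    return exp(add(log(a), log(b)))

def neg(a):
    # -a = exp(log a + i*pi), since exp(i*pi) = -1
    return exp(add(log(a), cmath.pi * 1j))
```

Because each composite reduces to the three primitives, defining backward rules for just `exp`, `log`, and `add` is enough to differentiate every operation built from them.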
