Show HN: Did you know you can multiply matrices without multiplication?
TL;DR: This idea implemented in Python https://gist.github.com/danbst/fdf604ae279f9e01c8a28f1f84f9876e
Hello. A long time ago I learned that
log(xy) = log(x) + log(y)
and that Napier used this to multiply numbers. I also knew that matrix multiplication time is dominated by inner dot products, which use addition and multiplication. And it struck me today that it is possible to multiply matrices in logarithmic space (that is, where the matrix values are logs of the real values) without any multiplication, using the same trick Napier did.
(There are restrictions: negative values produce complex logarithms, so those have to be handled carefully.)

This can obviously be done to replace any single multiplication; the problem is that two logs, an add, and an exponentiation are a lot slower than one multiply. The idea is to store all the matrix weights of a neural model in log space and never leave it, maybe even with a new activation function that doesn't leave log space. If you check the code, you'll see I don't just replace multiplication; I also perform addition (with tricks) in log space.

Could all of ML training be done in "log space", i.e. without ever paying the cost of the initial transformation and the return?
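For reference, a minimal sketch of the log-space matrix multiply I have in mind, assuming strictly positive matrices and using scipy's logsumexp as the log-space addition (the gist implements that part with its own tricks):

    import numpy as np
    from scipy.special import logsumexp

    def log_matmul(log_a, log_b):
        # Inputs are elementwise logs of positive matrices A and B.
        # Entry (i, j) of log(A @ B) equals logsumexp over k of
        # log_a[i, k] + log_b[k, j]: the pairwise products become
        # additions, and the sum is done without leaving log space.
        return logsumexp(log_a[:, :, None] + log_b[None, :, :], axis=1)

    rng = np.random.default_rng(0)
    a = rng.random((3, 4)) + 0.1   # strictly positive matrices
    b = rng.random((4, 5)) + 0.1
    assert np.allclose(np.exp(log_matmul(np.log(a), np.log(b))), a @ b)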