Generalized Orders of Magnitude (GOOMs)
github.com

Not quite clear to me what's being generalized here: the number of bits used for exponent and mantissa, something like that?
> GOOMs generalize the concept of "order of magnitude" to incorporate complex numbers that exponentiate to real ones. As with ordinary orders of magnitude, GOOMs are more stable than the real numbers to which they exponentiate.
> This implementation enables you to operate on real numbers far beyond the limits of conventional floating-point formats, for effortless scaling and parallelization of high-dynamic-range computations. You no longer need to scale, clip, or stabilize values to keep magnitudes within those limits.
Yes, I read all that, but it doesn't communicate clearly. In particular, I don't think they mean "complex number" in the sense it's ordinarily understood in mathematics, because in general exponentiating a complex number gives a complex result: e^(x+iy) = e^x(cos y + i sin y), which is real only when y is a multiple of π.
The paper formally defines GOOMs as the subset of the complex plane that elementwise exponentiates to the real line.
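To illustrate that definition, here's a minimal NumPy sketch of the idea (not the repo's actual API; `to_goom` and `from_goom` are hypothetical names): a real number r is represented by a complex z with exp(z) = r, so Im(z) is a multiple of π, and products of reals become sums of GOOMs whose intermediates stay representable even when the naive product overflows.

```python
import numpy as np

def to_goom(r):
    # Represent real r as z = log|r| + i*pi*(1 if r < 0 else 0),
    # so exp(z) = r and Im(z) is a multiple of pi.
    r = np.asarray(r, dtype=np.float64)
    return np.log(np.abs(r)) + 1j * np.pi * (r < 0)

def from_goom(z):
    # exp(z) is real up to rounding, since Im(z) is a multiple of pi.
    return np.real(np.exp(z))

# Round trip, including a negative number:
assert abs(from_goom(to_goom(-2.5)) + 2.5) < 1e-12

# Multiplying reals = adding GOOMs. The naive float64 product of these
# factors overflows, but the GOOM-space sum stays finite (~2072 + 2*pi*i),
# with the imaginary part tracking the sign of the product.
xs = np.array([-1e300, 1e300, -1e300])
z = np.sum(to_goom(xs))
assert np.isfinite(z)
```

This is only a scalar-log sketch of the definition in the README quote; the actual library presumably handles branch bookkeeping, batching, and parallel scans beyond what's shown here.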
We'll update the README to make that clearer.
Thank you for pointing that out!