Mitigating the Billion Dollar Mistake
gingerbill.org

I'm sure Bill understands what I'm about to say, but as a person on team "require explicit initialization", I think the mitigations I would be looking at are:
1. Only require that the programmer prove the variable is initialized before it's read. Riffing on Bill's example:

```c
Object *x; // Fine: declared, not yet read.
if (is_foo) {
    x = &foo;
} else {
    x = &bar;
}
x->a; // Fine: initialized on every path by the time it's used.
```
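For comparison, Rust's flow analysis already permits exactly this shape: a binding may be declared without a value as long as the compiler can prove every path initializes it before the first read. A minimal sketch (the names `pick`, `foo`, and `bar` are made up for illustration):

```rust
// Deferred initialization: `x` starts uninitialized, and the compiler
// checks that every path assigns it before it is read.
fn pick(is_foo: bool) -> i32 {
    let foo = 1;
    let bar = 2;
    let x: &i32; // Fine: declared, not yet initialized.
    if is_foo {
        x = &foo;
    } else {
        x = &bar;
    }
    *x // Fine: initialized on every path before use.
}

fn main() {
    println!("{}", pick(true));  // prints 1
    println!("{}", pick(false)); // prints 2
}
```

Remove either branch's assignment and the program is rejected with "used binding `x` is possibly-uninitialized", which is the proof obligation described above.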
Of course, this is still a trade-off: your compiler has to work harder to prove that every path flowing to each use is definitely initialized. And once you have initialization functions, you need parameter specifiers like 'in', 'out', and 'in/out' so that an 'out' parameter can take a pointer to uninitialized data, or something like MaybeUninit. This handles this example from Bill:

```c
Foo f;
Bar b;
grab_data(&f, &b); // Fine if declared as 'grab_data(out Foo *, out Bar *)'.
```
2. Something like Rust's MaybeUninit to explicitly opt in to dealing with possibly-uninitialized data. Obviously also a trade-off, especially if you want to force all the maybe-uninitialized handling into an 'unsafe' block.

In the first case, in a language with type inference, I'd argue it would probably be written a bit differently, probably as one of the following two forms:
```odin
x := &bar
if is_foo { x = &foo }

// or

x := &foo if is_foo else &bar
```

I know this is a simple example, but it's kind of just a different way to think about the problem. Sometimes the default C way of having to specify the type first is a good enough nudge away from this pattern when you can just infer the type from the expression, especially when the variable is being used not as a variable in itself but as an alias for another. The place where a `nil` pointer is more likely to appear due to no explicit initialization in Odin is within a struct, rather than on its own on the stack.

In the second case, Odin's approach would probably not even do that, but rather return multiple values: `f, b := grab_data()`. The use of `&` is for when a parameter needs to be in-out (i.e. read from and written to), which is different functionality.
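Both alternatives for `grab_data` can be sketched in Rust (the function names and the values 1 and 2 are placeholders, not from Bill's post): the out-parameter version spells the opt-in with MaybeUninit, while the multiple-return version avoids uninitialized storage entirely.

```rust
use std::mem::MaybeUninit;

// Out-parameter style: the callee fills storage the caller left uninitialized.
fn grab_data_out(f: &mut MaybeUninit<u64>, b: &mut MaybeUninit<u64>) {
    f.write(1);
    b.write(2);
}

// Multiple-return style: no uninitialized storage ever exists.
fn grab_data() -> (u64, u64) {
    (1, 2)
}

fn main() {
    let mut f = MaybeUninit::uninit();
    let mut b = MaybeUninit::uninit();
    grab_data_out(&mut f, &mut b);
    // `assume_init` is the explicit unsafe opt-in: the programmer asserts
    // that initialization actually happened.
    let (f1, b1) = unsafe { (f.assume_init(), b.assume_init()) };

    // The `f, b := grab_data()` shape, with types inferred.
    let (f2, b2) = grab_data();

    assert_eq!((f1, b1), (f2, b2));
}
```

The second form is the cheaper mitigation when the callee can construct the values itself; MaybeUninit only earns its keep when the storage genuinely must be allocated by the caller.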
Through my own experience moving from an IDE and higher-level languages to a simple text editor and (nullable) systems languages, I've noticed that the way I read and write code is entirely different, and I can still remember my "old eye." I think most people reading this view your "noise" as their signal. They get a good feeling when resolving tooling diagnostics (doubled if they're in an unfamiliar domain like systems programming). It makes them feel secure. In that sense, I agree very much with the article: "I honestly think a lot of this discussion is fundamentally a misunderstanding of different perspectives rather than anything technical."
Personally, I've noticed that I pause less and write more now that I've got less tooling in my editor, and that's very enjoyable. I'd encourage anyone to try it. I have no autocomplete or "hover" info, just compilation errors presented alongside the offending lines.