Possibly Fun C Code (yodaiken.com)

About "You are not expected to understand this":
> So we tried to explain what was going on [in the listing]. "You are not expected to understand this" was intended as a remark in the spirit of "This won't be on the exam," rather than as an impudent challenge.
> The real problem is that we didn't understand what was going on either. The savu/retu mechanism for doing process exchange was fundamentally broken because it depended on switching to a previous stack frame and executing function return code in a different procedure from the one that saved the earlier state. This worked on the PDP-11 because its compiler always used the same context-save mechanism; with the Interdata compiler, the procedure return code differed depending on which registers were saved.
Similar trickery can be found with the RETF instruction in x86 processors. On the 8086 it was the fastest way to do an indirect far call (in x86 jargon, a call that can reach code beyond the current 64K segment), since there's no way to do one using just registers. RETF pops the return address from the stack, then the code segment/selector, and jumps there. Simple, right?
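A minimal sketch of the trick, assuming a 16-bit x86 toolchain with GNU-style inline assembly (something like ia16-elf-gcc; far_jump and its parameters are made-up names for illustration, not any real API):

    /* Sketch: an indirect far jump via RETF. RETF pops the offset
       first, then the segment, so we push them in reverse order.
       For a true far *call* you would first push your own return
       CS:IP so the target's own RETF can come back to you. */
    void far_jump(unsigned short seg, unsigned short off)
    {
        __asm__ volatile (
            "pushw %0\n\t"  /* target code segment (popped second) */
            "pushw %1\n\t"  /* target offset (popped first) */
            "lret"          /* RETF: pop offset, pop segment, jump there */
            : /* no outputs */
            : "r"(seg), "r"(off));
        __builtin_unreachable();  /* control never falls through */
    }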
This caused havoc in later processors once RETF got optimized. Newer processors predict that RETF will always, well... return, and prefetch instructions based on that assumption. Used as a far call, it would reliably trash the prefetched instruction stream, making it actually quite a slow way to do far calls.
The Interdata was a strange machine architecture, from before the era when everything started to look the same. Via Wikipedia: https://books.google.com/books?id=pWBoOXVjuZ0C&dq=%227%2F32%...
> Turning on the so-called “optimizer” breaks this code because current C “optimizers” bizarrely enough can change the semantics of C code in often unpredictable ways without making any performance improvements. This is permitted by the C Standard for reasons that strike me as not well thought out, but let’s pass over this sad situation for the moment.
The C Standard has a thing called "undefined behavior", and this person is running right into it with their invalid pointer arithmetic. So the result is undefined and could be anything, including embedding the Doom source code.
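For the record, a minimal sketch of that kind of invalid pointer arithmetic (not the article's actual code): merely forming a pointer more than one past the end of an array is undefined, before you ever dereference it.

    #include <stdio.h>

    int main(void)
    {
        int a[4] = {1, 2, 3, 4};
        int *end = a + 4;   /* one past the end: the last valid pointer */
        int *p = a + 5;     /* undefined behavior just to compute this */
        printf("%p %p\n", (void *)end, (void *)p);
        return 0;
    }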
> could be anything
Why do people say this? It may be technically correct, but it isn't reasonable by any stretch. If I make a spelling error or miss a semicolon in my code, it may be legal for the compiler to do anything, but that isn't reasonable either. We rightly expect compilers to pick up on and point out spelling errors, so why don't we expect the same for UB?
Or do you also tell programmers not to make any errors because the compiler doesn't have to warn you about even the most obvious ones?
If I overflow a signed integer, I expect that to be on me, and depending on the context I could check for overflow after the fact. That isn't a licence for the compiler to do whatever it damn well wants, and practically speaking it doesn't really matter anyway. What processors in the past 30 years have done anything funky on integer overflow, or have weird signed-integer representations?
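"Check for overflow after the fact" is exactly the case the optimizer breaks, though. A minimal sketch (made-up function name; mainstream compilers really do this at -O2):

    #include <limits.h>
    #include <stdio.h>

    /* Because signed overflow is undefined, the compiler may assume
       x + 1 > x always holds for signed x and fold the test to 0. */
    static int next_overflows(int x)
    {
        return x + 1 < x;   /* often compiled to "return 0" */
    }

    int main(void)
    {
        printf("%d\n", next_overflows(INT_MAX));  /* may print 0 */
        return 0;
    }

The portable check has to happen before the addition, e.g. testing x == INT_MAX first.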
C gives you enough rope to hang yourself. The compiler isn't required to tie the noose and pull the lever too.
It's interesting how often people repeat this argument as if it were persuasive. The compiler developers and Standards writers made an engineering decision which is hard to justify, so most people who support it don't even try.
If your C isn't arbitrarily running Doom at unpredictable times, are you even doing it right?