Ask HN: Who had the crazy idea to make the stack grow down?

4 points by _e8at 3 months ago · 9 comments

ofalkaed 3 months ago

I don't think any single person did; they just went with whatever way best suited the hardware the language was being implemented on and the needs of the language being implemented. If stack underflows are rare, then growing down means testing for an overflow is always the same regardless of stack size, and I assume this is why some languages (like Forth) have -1 as true instead of 1: -1 is an overflow, so the language's own true/false can double as an overflow test. In languages which don't deal with the stack so directly, underflows are rare and may not even be possible, so having the last element of the stack at stack pointer == 0 simplifies things.
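
A minimal sketch of that overflow-check point (my reading of the comment, in C rather than Forth; STACK_CELLS, push, and pop are made-up names): when the stack grows down toward index 0, the overflow test is a fixed comparison against 0 no matter how large the stack region is, while the underflow test is the one that has to know the size.

    #include <stdio.h>

    #define STACK_CELLS 64            /* region size; the overflow check below never uses it */

    static int cells[STACK_CELLS];
    static int sp = STACK_CELLS;      /* empty stack: pointer sits at the top */

    static int push(int v) {
        if (sp == 0) return -1;       /* overflow: constant test against 0, size-independent */
        cells[--sp] = v;
        return 0;
    }

    static int pop(int *v) {
        if (sp == STACK_CELLS) return -1;  /* underflow: this test does depend on the size */
        *v = cells[sp++];
        return 0;
    }

    int main(void) {
        int v;
        push(42);
        if (pop(&v) == 0) printf("%d\n", v);
        return 0;
    }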

GianFabien 3 months ago

Back a long time ago, before GB memories and MMUs, the executable code was loaded at low addresses, statically allocated data followed, then dynamically allocated memory (the heap). So the stack was placed at the very top of memory and grew down. When the heap and stack collided, it signaled an out-of-memory situation.
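
A toy sketch of that layout (hypothetical addresses, not any real machine; heap_grow and stack_push are made-up names): code and static data sit low, a bump-allocated heap grows up from there, the stack grows down from the top, and "out of memory" is simply the two pointers meeting.

    #include <stdio.h>

    #define MEM_TOP 0x10000u             /* top of a pretend 64 KiB flat address space */

    static unsigned heap_brk  = 0x4000u; /* heap starts just above code + static data */
    static unsigned stack_ptr = MEM_TOP; /* stack starts at the very top and grows down */

    /* Grow the heap upward; refuse if it would run into the stack. */
    static int heap_grow(unsigned n) {
        if (heap_brk + n > stack_ptr) return -1;   /* collision: out of memory */
        heap_brk += n;
        return 0;
    }

    /* Grow the stack downward; refuse if it would run into the heap. */
    static int stack_push(unsigned n) {
        if (stack_ptr - n < heap_brk) return -1;   /* collision: out of memory */
        stack_ptr -= n;
        return 0;
    }

    int main(void) {
        printf("heap_grow: %d\n", heap_grow(0x8000u));   /* succeeds (0) */
        printf("stack_push: %d\n", stack_push(0x8000u)); /* fails (-1): would cross the heap */
        return 0;
    }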

  • bjourne 3 months ago

    To detect collisions you'd need write-protected memory. Afaik, most computers from the era you are thinking of did not have write-protected memory.

  • _e8at (OP) 3 months ago

    makes sense

sema4hacker 3 months ago

I've programmed in assemblers and higher level languages on a variety of machines over the decades, and I can't recall ever caring in which direction the stack grew, only whether it under- or overflowed.

bediger4000 3 months ago

The mostly forgotten HP-PA architecture, and whatever architecture Multics ran on, had the stack growing up and the heap in high memory.

_wire_ 3 months ago

If it grew up it'd be confused with the heap!

But srsly folks, memory with an origin of zero is a proud tradition that helps confused programmers know where to begin.

And given that within the von Neumann architecture program and data cannot be distinguished, and also noting the incredible utility of a stack for keeping a dynamic call chain with localized storage reference scope to support recursion, a paradigm that divides memory between a heap and a stack in a layout that's as open-ended as possible to the available storage and execution demands of the program seems not only prudent but fairly obvious.

Sure, feel free to inject an arbitrarily complex N-leveled storage abstraction built from pure message passing between caches within some larger, wildly associative machinery and stuff it into the nether regions of the machine. But regardless of such hijinks, as long as your memory is indexed and locally finite, you end up with at least two ends of memory, hither and yon, so may as well use them.

As to turning hither and yon upside down into yon and hither, knock yourself out! Show the world the future of memory should be inverted and palindromic-- introducing Z, the runtime environment where everything can and does start from either end or anywhere in between. No design or implementation is necessary. Every pattern in memory is a valid program. Just state your objective and start debugging. Voila! Problems that once seemed intractable are solved. Call it VibeZ coding.
