Why Hypercard Had to Die (loper-os.org)

This doesn't seem to be anything special about Hypercard.
Hypercard was the BASIC of the Macintosh in that it served the same niche. It certainly was not BASIC, but the idea was the same: a little language and environment, built into the machine, that allowed novices (and then highly practiced novices :-)) to create stuff for other people to use.
It was killed for the same reason BASIC, in the sense of built-in 8bit micro BASIC, kind of fell by the wayside except as a way to stitch existing apps together. No one is really impressed by a calculator program anymore. Matching the capabilities of modern software is hard nowadays. It's demotivating to create a calculator program and realize... that's it? Hypercard really can't match even JS, et al. in functionality.
I remember when it started to happen in the 16bit era. Software was already starting to get too good and too complex for a simple tool. I got an Amiga. The Amiga had a built-in BASIC -- but no one used it. Why? Because there was no fucking way you were going to recreate even the bouncing ball demo in BASIC. It wasn't going to happen. Essentially, anything you created was going to be a huge disappointment with that tool. This was not necessarily true in the 8bit days -- but from 16bit on, you either learned assembly/C or gave up. People can learn in that environment (and did), but the barrier was much higher.
Lowering that barrier while coming close to the capability of "real" software is a hard problem. Fortunately, it's coming full circle. Now that machines are powerful enough that "real" software is being written in interpreters, JS seems like it actually is the new BASIC.
I'm not totally sold on that -- because the stack is far too baroque (although, again, people do learn in that environment). Processing is tantalizingly close but not quite there because it's so domain-specific. Python comes with a lot, but doesn't fit into native environments very well. If I had to choose, I'd say it's going to be something else sitting over the JS/CSS/HTML stack, in the same way 8bit BASIC sat over the primitive OS at the time.
I wish Hypercard had been the BASIC of the Macintosh. However, when I tried to build something with it (a manager for the offline game Car Wars), I quickly ran into some trivial obstacle and was told that I'd have to write a C extension. (IIRC, it was the lack of a random number generator.) Enough of that.
Hypercard was mostly useful as a sort of personal wiki or address card book; I used it quite a bit as a notepad with hyperlinks. It didn't die for any really dramatic reason. It was stuck in the 9" B&W Mac era and never quite fit onto larger color screens, so once the web came around, Apple dropped it.
Also, your history is way off, because BASIC morphed into Visual Basic, which did everything Hypercard could and ten times more, and was the leading programming environment for a generation.
Similarly, you had to do a lot of peeks and pokes in most 8bit BASICs to really get at the machine. But you could still do something reasonably worthwhile. I don't want to imply that you could do everything. The main idea is simply that you could do quite a bit, and it really did look and feel decent compared to apps of the day. As the systems got more advanced, it became harder or perhaps just less of a priority to empower your average novice.
I've never thought of VB as being for the ordinary Joe in the same way built-in BASICs were -- it was more of a macro language for Office, and a platform-specific COBOL, not an entry-level environment. But I don't have enough direct experience -- perhaps I'm wrong about that...
It's funny you mention a personal wiki. I think the first thing I used Hypercard for was to create a catalog of my comic book collection.
(All of the work that went into it was lost due to an accident, which taught me a hard lesson about the need for backups.)
> This doesn't seem to be anything special about Hypercard.
Tell that to the thousands of otherwise non-programming people who developed real, useful applications (some of them best-selling) in Hypercard.
> it's going to be something else over the JS/CSS/HTML stack
Like building a castle on a swamp. Foundations matter: http://www.loper-os.org/?p=55
I agree. The web stack is a castle on a swamp, except that swamp is itself built above a hyperspace junction linking everything in the world together.
The web surpassed HyperCard from day one, simply by locating its resources on the network. And that's why people put up with it, even if we have to dive into a fetid quagmire every time we use it.
HyperCard versus the web is a classic example of Worse is Better.
Is it not programming because the syntax is verbose? I don't know anything about hypercard, but from the example in the OP it just looks like a limited and clunky IDE with a toy programming language. Just enough to build things that are just about good enough for people with limited goals, but then again there are thousands of products like that out there.
The whole paranoid 'everybody conspires against it because it's so great' line is bullshit. If it's so great, why hasn't anyone made a clone and gotten rich off it? It's quite obvious why: because everybody who uses it runs into its limitations very quickly, so clones add extensions, and after a few iterations it becomes too difficult for beginners. And then the next clone stands up, lather rinse repeat.
That rant was quite nice... had me going for a minute.
But I'm sure it is one of many in a line of "reasons the software crisis didn't have to happen" - riiight. The "mythical man month"? It's 'cause they didn't do it right in 1960/1970/1980/1990/2001/... If only "they" would learn...
So doing "it" right is a physical impossibility, then? Care to say why?
Because 'right' is in the eye of the beholder? It's subjective.
We agree much more on what's "right" than you suggest.
Take two systems which do the same things. One does it as expected, the other surprises you in subtle ways. One does it as specified, the other sometimes does not. One does it quickly, the other makes you wait a bit. One fits in a few pages of code, the other takes a whole book.
I agree that you can't tell what's right in advance. But it's not a matter of preference, it's a matter of ignorance. In hindsight, when you see the results, you can most of the time point out what could have produced better results, if only you had known. You can even go meta, wondering why you didn't know, then try and change that in the future.
I upvoted you because I think this is a conversation that should be as civil as possible. My original tone was probably a bit too sarcastic.
On the subject of "doing it right" - every software engineer, or maybe most passionate software engineers, has the ability to look at a spec or a piece of code and see whether it is "done right". I certainly have my conception, my agenda, of what the right way to code and good software engineering is.
The thing is that this comes after fifty years of efforts to "do it right" failing in the sense that we don't have a single language we're satisfied with, we don't have a single operating system people unambiguously call good etc. The edifice of modern computing seems to lack a sound foundation.
The grizzled software engineer often knows this as a fact without caring about why and indeed the why is obscure.
Oddly enough, I think I can illustrate my explanation for "why" by noting the common complaint that programmers are "constantly reinventing the wheel". Now, if we look at the automotive engineers who build cars, we will note that they too are "constantly reinventing the wheel" (literally, these days). Yet no one complains that the automotive engineer must create a new wheel for a new car with different mechanical properties than the old car which had the old wheel. Here, the problem and the solution ("reinventing the wheel") do not intuitively strike us humans as a problematic state of affairs.
Looked at this way, the initial criticism of a software engineer "reinventing the wheel" is a bit ridiculous. It seems logical that an engineer needs to change the components whenever they are putting together a different system for a different purpose. Moreover, the variety of distinct circumstances a software system needs to be engineered for is vast - it faces far more meaningful context changes than a car. From this perspective, it is absurd to expect the software engineer not to keep "reinventing the wheel".
Yet, intuitively, this just doesn't seem right. I would claim that our intuition involves a natural conflation of something like "the idea of a wheel" with "the software implementation of a wheel". A software implementation of a wheel (or of anything) is more nebulous than a physical wheel, but it is still not "the idea of a wheel". Because we "naturally" conflate these two items, it is easy for us to believe that we only need to think up the concept of an entity and we will have captured the thing. And this natural conflation of ideas is perhaps where things go wrong... where we get off expecting a "general purpose operating system" to satisfy the gazillion purposes assigned to it, etc.
You're completely missing my point. I'm actually sort of agreeing with you -- but I'm also saying you don't really need (want?) to duplicate Hypercard.
Let's just say that considerably more than "thousands" have written BASIC programs that were real useful applications in their day. I would not argue that people have not done the same in Hypercard.
There's nothing special about Hypercard. What's special is having an easy-to-use beginner programming environment in which you could create fairly good small programs -- in comparison to the commercial offerings of the day. BASIC was that, Hypercard was that, but that only partially exists today as the web stack because of its baroque nature.
"Foundations matter" is really a tautology. I don't think the problem is the foundations. We build on sand all the time. Certainly without a foundation, you're fucked, but you place too much importance on it. Saying something like "Foundations matter" is pretending to be profound without substance. I might just as well say that "Input devices matter" -- and they do -- but how much do they matter? Is everything fucked because we all use mice & keyboards?
> "Foundations matter" is pretending to be profound without substance
Please take the time to read my article. The kind of pieces you are left with when something breaks or needs reworking does matter.
> Is everything fucked because we all use mice & keyboards?
I do not myself believe this, but there are those who do, and their arguments are worth paying attention to.
Yes, I read your article. I think I understand where you're coming from, but it's completely circular. All you need to do is create another mostly impermeable stratum. People do this all the time, and it has been wildly successful so far.
In that sense, original foundations don't matter. But, of course, they do, 'cause you had to build a new stratum, right? Round and round...
And, with the passing of time, you can simply replace the underlying foundations with something more unified. This, again, happens all the time. CPUs gain SIMD instructions and new addressing modes, VM support, etc. I fully expect more hardware support for higher-level GC to come (nothing like DESCRIPTOR, mind you) if its worth is proven.
The input device remark was probably a poor analogy, but it is somewhat similar. "Input devices matter". Well, duh, yeah. But it certainly is not a good summary of an argument to replace the mouse and keyboard with a generalized gesture recognition device. :-) The mouse was an addition to the keyboard, not a reconstruction of it. If we find that, say, speech is better at some things, we don't propose to take away everything else. We build on top, because, while it may offend the sensibilities of some -- you can fix the warts later.
You're right to say that we're all working on huge 8bit micros to an extent -- but beyond the historical implications, it's circular to think that really matters. It's why Alan Kay now sounds like a grumpy old man, as Raskin did, as do most people who moan about Lisp machines and the DESCRIPTOR architecture. Today, I can run emulations of these things far, far faster than the originals, and the software on my box is far more capable than the software people were using then. The real world favors cutting away/polishing a turd into a smooth stone, not piling up someone's perfect diamonds -- because restarting from scratch every time you discover something new is wasted work.
You are lauding HyperCard and calling JS/CSS/HTML a swamp? Seriously?
JS/CSS/HTML all have flaws -- some more vexing than others -- but they (and the browsers that put them to use) are infinitely superior to HyperCard.
HyperCard was one of the most influential programs in history. Consider it was released in 1987 and ran in 1MB of RAM. It had four major impacts:
1) It was a concrete implementation of the idea of hypertext that actually worked. The fact that comments in HTML begin with <!-- is a tiny little ode to HyperTalk.
2) It was the first graphical IDE that I know of. (Clunky? At the time -- 1987 -- it was glorious.)
3) It was one of the easiest programming languages to pick up, if not the easiest, and yet it scaled to become quite powerful. (There was eventually a native compiler, itself written in HyperTalk, that could even create INITs.) HyperTalk's ease-of-use led to blind alleys (AppleScript tried to one-up HyperTalk and ended up being "read only").
4) It was extensible via plugins.
It was also an incredibly productive programming tool. In addition to allowing novices to code, it let me -- for example -- implement an RDBMS engine in an evening, and build a database application (including reporting functions) in a second evening. It allowed the Millers to create Myst (they used a couple of plugins, one to display color images).
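For anyone who never saw HyperTalk, here is a rough sketch of what a typical button script looked like (the field name "greeting" and the prompt are invented for illustration, not taken from any stack mentioned here):

    on mouseUp
      -- runs when the user clicks the button this script is attached to
      ask "What is your name?"
      -- the reply lands in the special variable "it"
      put "Hello," && it into card field "greeting"
      go to next card
    end mouseUp

Handlers read like plain English, variables need no declarations, and there is no separate compile step, which is much of what made it approachable.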
But HyperCard was written by a lone genius (Bill Atkinson) with weird quirks and flaws and it was impossible to maintain or improve. Its notable flaws included:
a) Weird code-base. It never really got a version 2.0. This is probably its single biggest flaw since all the others are things you'd have expected to be fixed in later versions. (Yes I know there was a version 2.0, but aside from a new debugger and plugin interface it wasn't a big improvement and it was slow coming.)
b) No native support for images. Visual Basic would address this by allowing you to treat images as just another kind of variable and manipulate them directly.
c) No native support for color, and 1-bit graphics were built into it in a very hard-to-fix way. Indeed, HyperCard was rewritten from scratch for the Apple IIgs and worked fine in color, but the Mac version never did.
d) No support for native controls. All of HyperCard's controls were faked and looked wrong. (And this flaw was faithfully copied by HyperCard's many imitators -- such as SuperCard, Runtime Revolution (still going!), and Asymetrix ToolBook.) Again, VB addressed this.
e) No ability to create true standalone applications. Once again, VB addressed this.
HyperCard wasn't killed by Steve Jobs. It withered on the vine and Steve Jobs simply took it off life support. As for the other stuff he killed -- yeah, some of the dead-end Lisp-based projects that hadn't already been killed. No conspiracy -- HyperCard just wasn't fixable and by 1997 it didn't matter any more.
Several HyperCard clones went on to be pretty successful. Macromedia Director came into its own when it copied HyperTalk (which became Lingo). Visual Basic was in essence an improved HyperCard but with a crappy language. (In every head-to-head test I tried between VB3 and HyperCard, HyperCard hugely outperformed VB3, which if you know anything about HyperTalk is pretty sad.) And then of course there's the whole web thing.
SuperCard is one clone that picked up where HyperCard left off, with the extremely similar SuperTalk language. It fixed a lot of Hypercard's issues, like built-in color support, access to native UI widgets, multiple windows, and the ability to create standalone applications. Updates added both OS X and Intel compatibility.
But it's definitely a niche product, which we've seen isn't something Apple is interested in. Supercard.us is down right now, so I'm not sure whether it's on the market or not. As Apple has discovered, the vast majority of computer users aren't interested in creating software; they're content consumers. SuperCard probably got as much use for making quick mockups before building a "real" application as it did by amateur developers.
One thing that I'll give HyperCard is that it made it easy for me to play around with programming while I was in elementary school. Today's programming tools are largely not that accessible.
Good of you to mention SuperCard which was, in many ways, the obvious successor to HyperCard (and it was revived a few years back and died owing to lack of interest). These days Runtime Revolution fills that niche. SuperCard was kind of flaky (I developed some stuff with it) and, more importantly, took an IDE -> shipping app model (where the dev environment was pretty much split off from the standalone app). This made it more useful for "real programmers" but less accessible for tinkerers.
It's probably worth mentioning that HyperCard was incredibly stable. You could work in it for months on end without crashing or losing any work. That alone was pretty staggering for the time.
Nothing special, other than the fact that you could program a game like Myst in it.
> other than the fact that you could program a game like Myst in it.
As long as you knew C and could write the Hypercard extensions needed to make Myst actually work, of course.
You could program a game like Myst in almost anything -- even BASIC with line numbers.
Edit: With the same amount of effort.
Why not with a magnetized needle, too?
Sigh. Maybe instead of ranting at your "typical software engineer", the author should spend a moment to consider that maybe it "could be built again" - but it never has been.
So the question then turns to, why has it never been built? Is it maybe because Hypercard (and other visual systems) become entirely unmaintainable once we get to large-scale systems? And maybe it is because most people don't want a trivialized programming environment? They either want a full system, or just a product?
> They either want a full system, or just a product.
While I think this is true at the moment, it seems that there is an interesting question behind that, as well. Why don't more people want to create their own software? I do think the author has an implicit point: because, frankly, creating software stinks. Jon Skeet's talk [1] demonstrates this admirably. I spend a shockingly large part of my time working around bugs and leaky abstractions in other software rather than implementing my own ideas.
I don't know if it is possible to design a system with a solid enough abstraction that these problems don't exist. I do know that HyperCard came unusually close, as I know several folks who made HyperCard stacks who wouldn't imagine creating software in any typical fashion.
I hold out hope that such a useful system could be created again, one that gives people enough flexibility to create software solutions of their own (perhaps within a genre of software) and that is able to combat the currently-accurate stigma of programming being "hard".
[1] http://msmvps.com/blogs/jon_skeet/archive/2009/11/02/omg-pon...
Lego Mindstorms (http://mindstorms.lego.com/) is kind of like Hypercard for the modern age. It's got interesting applications (robotics and automation), and an easy-to-learn programming environment. I'd say it's the perfect product for a budding geek to cut her teeth on.
Hm. I've never seen a HyperCard project that was both maintainable and of a large-ish size. I think _that_ is where the HyperCard idea broke down - navigating through a bunch of cards is significantly more difficult than through a bunch of text files.
Granted, that _may_ be a question of the tools. But I do find it telling that in the long time since the "death" of HyperCard nobody could come up with compelling tools.
And creating software is hard because ultimately, it requires analytical thinking. Which, by itself, is hard. Yes, the choice of tool modulates the hardness of the problem - but the underlying issues are still hard. (Note: I do not claim non-programmers _can't_ reason analytically. I claim the effort/result ratio is not right for them)
It's the same reason most people buy furniture instead of building it. Acquiring the necessary skill set is simply too much effort for the result.
I definitely agree that software design requires analytical thinking. (It certainly put a halt to my thoughts about a mass-appeal software builder: if they don't "get" [or want to get, or whatever] Excel, they don't have a chance with much else.)
I do think, though, that not being able to make projects of large-ish size isn't necessarily important for a wide variety of use cases. Most of the best apps I know of do only a very small thing -- but do it very well. I think HyperCard allowed people to do small things exactly the way they wanted. Which maybe wasn't very well, but was good enough for their needs and better than the alternatives.
As I mentioned in my other comment [1], I wonder if the parallel is somewhat like the UNIX command line. You probably aren't going to write a MySQL competitor in bash. But there are a whole class of small custom tools that you will write.
To perhaps restate my original point (and this is where perhaps we agree), the abstractions of HyperCard eventually didn't (couldn't?) adapt. Perhaps, as you said, because of cards. As someone else commented, also partially because of the codebase, but also, I think, because things like color and networking and a greater number of standard UI widgets all had to be added.
Spreadsheets are arguably a "trivialized programming environment".
Which (sort of) supports GP's argument. We have programming environments on one end, full products on another and spreadsheets (maybe also MS Access) in the middle. The environment looks crowded.
Btw, I'd love to see spreadsheets-as-apps replaced with something more maintainable, but they also have a huge advantage: you can start using a spreadsheet with zero programming. Unless you copy that, you can't compete for that niche.
If we have room at the ends (for new programming environments, and new products), why can't we have room in the middle?
Something else with zero programming is online form builders; some already have constraints and reporting systems. If they added simple programming constructs, it could be as powerful as you like. (e.g. google forms enables you to skip to different questions based on previous answers, though it's getting away from easy visualization, unlike spreadsheets). You can view them as being a database, derived from input forms - what's stopping them growing towards unmaintainable real databases, in the way that spreadsheets can grow into unmaintainable apps?
A way to think about this is not in terms of a complex thing, but in terms of a simple thing... a toy... to which you can add on complexity. e.g. I'm not sure that the first spreadsheets were fully programmable, but just related different cells by formula. And I know for sure that the first RDBs didn't have stored procedures.
Maybe this could be done for many webapps: wiki + scripts; reddit + scripts (what would that be?); youtube + scripts (we have it a little bit with links on parts of the video). The puzzle is imagining what could possibly be the use of these... it helps if you already have a manual version that you are automating (that's what the VisiCalc guys did: "spreadsheets" existed, on paper, before them: http://inventors.about.com/library/weekly/aa010199.htm).
There are visual webapps for assembling webservices, but they don't seem to have taken off. Maybe the "problem" is that they scale - the thing you create is good for more than just one person, unlike most spreadsheet documents, which (excluding templates) contain your own specific data - often, proprietary business data. Because it can scale, people invest more effort in it, and are happy to use proper APIs etc. And then, consumers don't have an unsatisfied need, because it's already done. Therefore, perhaps the "toy" needs to be about a particular business or person (e.g. be about their data).
Also: Arguably, Flash authoring tools took the place of hypercard.
I think we agree - my point was not that there is no room in the middle but that if you want to go there you need to give some zero-programming benefits for starters.
And yes, forms are an example of what this benefit might be.
Yes, I think your zero-programming point was spot on (though check that spreadsheet link: I was surprised to learn that visicalc automated existing paper spreadsheets, so "programming" wasn't even an issue, at least for its initial massive success).
But note that we're not really in "support of GP's argument" anymore. ;-) That's what I was arguing against (but excellently provocative questions BTW).
Point well taken. So the question then becomes, why only one (or a small number of) such environments? (Also, I consider spreadsheets hardly trivial. I marvel at what some people can do in Excel ;)
And my answer to my own question is still the one I previously implied: There is no market for this kind of environment.
Consider, your problem needs to be:
* Repetitive enough to benefit from automation.
* Complicated enough that automating it would yield a significant gain in the long run.
* Simple enough to not require going to an actual full-blown programming environment.
* Not numerically solvable, since spreadsheets have that covered.
* Not solvable via macro-recording or workflow solutions, since that market is also well-covered.
I'd say that does leave a fairly small sector. Hence, low demand for Hypercard-like solutions.
I had the same reaction of "no market" from the article, but on reflection I suggest turning it around: instead of thinking of it as "a programming environment", think of it as a specific application first, which has customization added to it and eventually becomes programmable. Seen this way, such environments are very common - they are just not presented as "programming environments". And that may be what is different about Hypercard - how it is thought of, not what it actually was.
- I think macro-recording etc counts as a trivialized programming environment
- Many standard desktop applications are programmable (though these days they just throw in a pre-existing language): word processors, for example (Word uses VB/.NET; OpenOffice uses Java/Python; Adobe Acrobat uses JavaScript). I think you'll find that earlier word processors had more limited programming environments; before that, not-fully-programmable scripting; before that, only macros; before that, only specific and limited explicit configuration... and before that, no customization at all.
- Flash authoring tools can do what Hypercard did, and more; they are also very easy to use for simple things, and even do it in a similar way to hypercard (but include a full programming environment for more complex things, like spreadsheets do). NOTE: flash's programming abilities have been enhanced tremendously over the years. I don't know the origins, but I wouldn't be surprised if it initially didn't have a "programming" capability, but just customizing some aspects of playing movies.
- html + javascript itself is not that different from hypercard, especially if you include an "authoring tool", of which a great many exist.
- wikis are also hyperlink-based...
- online forms (wufoo, google forms) - they derive a database from forms (like ORM, but skipping the objects, to be "FRM"), and include constraints, different paths, and reporting tools.
- vim has a programming language, but it's about as pretty as programming with a spreadsheet. This is because it evolved from something simpler, to get specific tasks done - I don't think vi was programmable! Certainly not in the earliest versions (emacs is the exception, beginning with an actual programming language). I'm not even sure that shells were initially programmable, for that matter... and shells are about as close to the programmer as a tool can get.
[Also please check my follow-up comments, esp: visicalc was not initially about programming.]
---
TL;DR I put it to you that any tool can have automation added on to it - and if we view it this way, "trivialized (or specialized) programming environments" are the norm, not the exception. Their goal is not "programming", but to solve the specific problems and meet the specific needs of their users. But the natural direction of customization/configuration is programmability (and then adding programming features to help manage complexity, e.g. libraries, namespaces etc). It might take a while to get there. Some instances we see are only part-way there (see above); some stop growing or die before attaining that level of customizability.
Going back to flash, ActionScript 3 added optional static types, enabling dramatically faster performance; before then, it was a version of Javascript; before that, it was a kind of hacked together script thing. I don't know what they had way back in Flash 1.0, but I suspect it wasn't yet programmable... just as visicalc 1.0 wasn't programmable...
If you look at it as purely customization, yes, there's a thriving market. But the author is specifically looking for something that creates a world 'where the distinction between the “use” and “programming” of a computer has been weakened'.
And the point is that those are fundamentally different activities. There's only a very small intersection that needs the simplicity of "use" and the complexity of "programming". And given that's an incredibly hard balance to strike, I stand by "no market" - at least for a product _specifically_ aimed there.
aka "operation" vs "design". I agree, for targeting that explicitly as a pedagogical or philosophical end in itself. A cool ideal, btw (though I think it works better with less powerful programming - more concrete, perhaps only regular or context free not turing complete).
In real (non-coding) life, there's usually overlap, of "adjustments". e.g. you're cutting tomatoes with a knife, and adjust the knife's angle, or your grip, or move the tomato, spin it on a vertical axis, rotate it, try sawing vs slicing, maybe change knives, etc. Perhaps you exclaim "This is the best knife for tomatoes!" and resolve to use only it henceforth. But many of these adjustments are unconscious, and part of everything we do. Is it "operation" or "design"? I think it's fuzzy in practice; we often chip away at things as we learn and adapt.
True, in software, there's usually a sharp line between user (operating) and programmer (designing). Customization crosses that line: macros, templates - even hiding menus you don't use. Is hiding a menu "programming"? While not Turing complete, it's a step closer to it.
I agree there's not much market for the combination, in itself. Programming is so accessible these days, if you want to do it, you just do it. Probably starting with HTML "programming", then Javascript or PHP. It even looks like a real website! It's like hypercard, but global.
I recently discovered Quartz Composer, which is included on any Mac w/ Xcode, and I was pretty impressed by its capability. IMHO it is one of the best kept secrets of the apple-verse.
While it is very different from Hypercard, it certainly provides the kind of artistic expression and creative freedom that the article suggests was present in Hypercard.
When I booted up QC for the first time a few months ago, I was blown away by its awesomeness.
And then when I further explored the vibrant community out there AND the fact that it can be readily integrated into objective-c / cocoa, it's clear that Apple is very much in tune with what normal people and creative people need to express themselves on a computer. To get started, here are some links: http://en.wikipedia.org/wiki/Quartz_Composer and http://developer.apple.com/technologies/mac/graphics-and-ani... Hint: If you have Xcode installed, simply type "quartz composer" into spotlight and begin your journey.
A final note: I seem to recall, but am not certain, that a) QC came to Apple through the acquisition of a French company; and b) the original developer has since moved on from Apple.
I was sad to see Hypercard abandoned - I learned how to program with it myself (along with Perl & CGI apps - syntax didn't matter so much as what you could accomplish with it). But part of the reason, I think, is that Steve Jobs stopped going primarily after the education market when he returned to Apple - he had already been burned by it (or become disillusioned with it) at NeXT, and really Apple had already saturated the education market. He and Apple focused on the consumer market (iMacs, iPods, etc.) - the business market had already been won by Microsoft. Microsoft also already had a more successful end-user-friendly programming/scripting environment: Visual Basic and VBA. In case some don't know, Visual Basic was the most popular programming language in the world until it was overtaken by Java. VB didn't really start to significantly decline until more people developing business apps switched to C# and other languages.
Microsoft also won with a more popular tool for creating visual 'stacks' or presentations: PowerPoint. Most educators were basically using HyperCard/SuperCard/HyperStudio to create presentations - and PowerPoint made it easier to do so.
Hypercard (and HyperTalk) isn't dead though. AppleScript is still around (despite Apple also trying to kill it). LiveCode and other options are still around and being updated for HTML5 & mobile platforms, as is Visual Basic, too (NS Basic). A Java port of the HyperTalk language is here: http://code.google.com/p/openxion/
Natural language-like interfaces aren't dead, either. Look at testing tools like Cucumber, the google search engine, Siri, and so forth. Look at all the DSLs out there that try to make Ruby/Javascript/etc. more like natural languages, at least in certain contexts.
HyperCard is very near and dear to my heart. When I was a kid I taught myself how to program with Hypercard. I'm a software developer today, and despite the crazy verbose syntax of HyperTalk, I still acquired an intuition for programming that remains helpful, 15 years later.
I actually approached Steve Jobs about the demise of Hypercard in 1998, when I was 15, at the Seybold SF conference. IIRC he gave a pretty dismissive response about it, basically saying that there wouldn't be a market for it anymore. It was pretty obvious that he didn't care about HyperCard or HyperCard-like products anymore. That, or he didn't want punk teenagers questioning his business strategy.
As a side note, I got my photo taken with him. When the photo was developed I saw that I was wearing a megawatt smile (shit, I met with Steve Jobs!), and Steve was looking distractedly at something in the corner.
I think much more likely explanations than the nefarious anti-creative one given for HyperCard's death are that, to varying degrees: Steve found all HyperCard stacks he saw to be messy and confusing, not at all the functional simplicity he was looking for; HyperCard was taking engineering resources and/or money that could be better spent saving the Mac, and therefore Apple.
Apple has not, in recent years, on the Mac, been against trying to provide simpler programming environments - look at Automator, (the now also defunct?) AppleScript Studio, or Dashcode.
Simple programming environments have a fundamental flaw: they fool you.
Let's say we have a type of perforated balsa wood that you can just snap into pieces and glue in place. Making a dog house just went from hours to minutes! Hurrah! So you start telling everyone that this is the new way to construct buildings, but then as you get bigger structures, it starts to fall apart.
Simple programming environments fool you into thinking your projects can scale, and the result is a mess. Hypercard was fun, but it wasn't a deep paradigm, it wasn't good syntax, and in the end, I have to say that it was good that it died.
"Simple programming environments fool you into thinking you into thinking your projects can scale, and the result is a mess."
Doesn't seem to have stopped Excel in particular, and spreadsheets in general, from being wildly successful.
Very good point. I think spreadsheets are just deep enough, and map well enough to the domains that use them, that they can tread water. (Not swim, mind you.) They're also quite clear about their constraints.
Of course, from a software engineering perspective many things that are developed using spreadsheets are truly horrific - but the non-developers who create complex systems using them love them.
Spreadsheets have a much more brilliant potential than we currently understand. In fact, it's my opinion that in the future we'll be essentially coding in a spreadsheet; not a text editor. (That's the case with the project I'm working on.)
If you want to talk about this more, I'm david927 at gmail.
Simon Peyton Jones et al. wrote an interesting 2003 paper on extending Excel's "natural" mapping to functional programming with first class (i.e. cell-based) user-defined functions.
http://research.microsoft.com/en-us/um/people/simonpj/papers...
If less of our finance system depended on spreadsheets and were actually well understood and observable by properly designed software, the 2008 crisis could have been averted.
And why does it need to be "deep"?
It's good that it died because it doesn't "scale" to the elephantine size of software you're accustomed to?
>It's good that it died because it doesn't "scale" to the elephantine size of software you're accustomed to?
As someone who's dealt with accounting systems that have started out as Access + Excel + VB Macros, I can say yes, absolutely it is a good thing that it died because it didn't scale.
I was a fan of Hypercard all those years ago. But, in my opinion, with that combination -- lack of depth plus poor syntax -- it just wasn't a strong enough contender. It could be kept as an introductory technology, but even there, we can do better. The same fate befell VB, Actor, Object Vision, Omnis, and so many others. It's a cold world if you can't keep up.
But notice that list. It's all efforts from the 80's and early 90's. We gave up at some point. There's no excuse for that. Shame on Apple, and everyone, for that.
Maybe I'm being too hard on Hypercard, but that criticism is only about that particular technology -- certainly not the effort as a whole.
> It's good that it died because it doesn't "scale" to the elephantine size of software you're accustomed to?
Does nobody else find this hilarious? I mean, the whole persecuted-Mac-fan vibe here; it's so 1990s it's almost a time warp in itself, except this time it's a persecuted-Classic-Mac-fan, and the main entity doing the persecution is Apple itself.
Steve Jobs demolished Apple and replaced it with NeXT.
Umm, SuperCard still exists and is under active development. Even back in the day it was superior to HyperCard in every way.
My guess was that HyperCard was killed because the WWW was coming. I used SuperCard pretty heavily back in '92ish at the second college in Minnesota to get the internet (MCAD - U of M was first). I remember the rows of NeXT boxes they had, the only machines connected and I remember fumbling around with building Gopher sites as well as some basic HTML hacking, as basic as it was back then. I lost interest in HyperCard/SuperCard shortly thereafter.
But its spirit certainly lived on in Visual Basic, Borland Delphi, Macromedia Director and a bunch of other things. I don't know that this guy had exposure to any of these, hence the short-sightedness and, in my opinion, the miscalculation of Jobs's motives.
> the WWW was coming.
In 1998?
> its spirit certainly lived on in Visual Basic, Borland Delphi, Macromedia Director and a bunch of other things.
I have used all of these, and beg to differ. The spirit of HyperCard was that of radical simplicity, and it does not live in these systems.
> > the WWW was coming.
> In 1998?
I built my first HTML page in '93. NCSA Mosaic was the rage in the lab.
> > its spirit certainly lived on in Visual Basic, Borland Delphi,
> > Macromedia Director and a bunch of other things.
>
> I have used all of these, and beg to differ. The spirit of
> HyperCard was that of radical simplicity, and it does not
> live in these systems.
???
Drag and drop, double click to add script, how is that not the spirit of HyperCard? HyperCard was the preeminent RAD development tool. Easier than Visual Basic? Sure, but not by much. The gap between BASIC and HyperTalk is not that great a leap. If you can't see the parallels between Macromedia Director and HyperCard, you're being blinded by your own weird sense of what's what - considering they both used the same basic language back then.
> Hypercard is different from systems like Director in what isn't there: the cancerous complexity.
No offense, but you're grasping at straws now. Any stack of any complexity was just as complex as the equivalent in Director. I'm not sure what you were doing in 96 or so, but I was making a living with Director after cutting my teeth with some serious SuperCard action. I wrote a precursor to AIM/ICQ for our feeble AppleTalk network with SuperCard, and that was not simple at all.
I guess if you were writing simple back and forth stacks, then yes I see your argument. But anything truly useful beyond that was just as complex as anything else. I appreciate your romantic notions to the contrary however.
I never used Director, so I can't speak to that. And, by '96, I had graduated from HyperCard into C and Pascal.
But: I spent a lot of time in HyperCard, and I can assure you that the field of HyperCard complexity was not limited to just "simple back and forth stacks" or "anything truly useful beyond that". HyperCard afforded a really nice way of gradually ramping up complexity and approaching complex-on-the-outside problems with simple-on-the-inside code.
I doubt I could remember any of my own HyperCard projects at this point if my life depended on it. But, I can tell you about a HyperCard project a good friend of mine did: he called it "MusicMaker", and it came with a piano keyboard, multiple synthesizers, and its own unique musical notation system which allowed you to easily translate any sheet music into text which the HyperCard stack could synthesize and play on the piano keys. It could teach people music better than just about any other piece of software at the time. This was not, as I recall, remarkably challenging for him, and we both had a lot of fun with it.
I think any network-related programming is going to suck. Tricks of the Mac Game Programming Gurus had an entire chapter devoted to it, as I recall, most of which could be summed up as "OpenTransport sucks". So, I don't think it's fair to use a networking-related program as an example of a challenging HyperCard (or SuperCard) project.
> I appreciate your romantic notions to the contrary however.
(edited for snark)
Please try to keep the snark to a minimum. Thanks.
Yes, and I wrote a MIDI sequencer in Borland Delphi in about 1997 and that wasn't particularly challenging either, in fact simpler than attempting similar in SuperCard: http://www.sonicspot.com/aliendiskosystems/aliendiskosystems...
The point is, people tend to gloss and shine and wax poetic about HyperCard and its ilk, but if it was so profoundly simple and awesome, why aren't SuperCard and its relatives (MetaCard or Toolbook, anyone?) prominent development platforms? Because that simplicity becomes complexity once you cross a certain threshold.
> Please try to keep the snark to a minimum. Thanks.
Please keep attempts at editing people's personalities at a minimum. It's how I talk, it's how I write, I make no apologies for it.
Cheers!
The technical qualities of a technology are not the sole determinant of its success or failure.
> I'm not sure what you were doing in 96 or so
I was a boy. A boy playing with Hypercard. As hinted at rather transparently in the article.
> But anything truly useful beyond that was just as complex as anything else.
Complexity of interface is different from complexity back-stage.
Hypercard is different from systems like Director in what isn't there: the cancerous complexity.
Viola was publicly available in '92; when Mosaic came along in early '93, it was pretty clear to people working in the field that, although what the Web did was a lot more limited than other hypertext tools, the network effect made it infinitely more powerful.
HyperCard like systems are not dead. I make my living from LiveCode ( http://www.runrev.com ) which is a xTalk system that is able to build applications for Mac OS X, Windows, Linux, iOS and Android. From a single HyperCard-like environment, I can build apps for all these platforms and reuse the code.
LiveCode is very verbose, and it may take a while to get used to for those who are already familiar with C-like languages, but it is worth it. Software is easy to maintain and there is little room for confusion, even with few comments.
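To give a sense of the verbosity being described, here is a small illustrative xTalk-style fragment (the field name "names" is invented for the example, not from any real LiveCode project):

    on mouseUp
      -- count the non-empty lines in a field and report the total
      put 0 into lineCount
      repeat with i = 1 to the number of lines of field "names"
        if line i of field "names" is not empty then add 1 to lineCount
      end repeat
      answer "Found" && lineCount && "names."
    end mouseUp

Chunk expressions like "line i of field" take the place of the index arithmetic a C-like language would use, which is what makes the code wordy but readable.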
RunRev also releases a PHP-like server engine, so I can create web applications that communicate with native clients on mobile and desktop, all from a single language and environment; this is very powerful.
You should always use the best tool for your job. For my job, LiveCode fits perfectly. I also think that xTalk languages are a wonderful introduction to programming and people from non-technical backgrounds can learn it easily and start creating little tools to help their specific domains. Yes, anyone can learn programming in any language, it is just a matter of effort, I believe LiveCode makes this effort fun and productive.
You may read this as advertisement but this is my personal opinion. I am 31 and have been using LiveCode since I was 25 or something like that. In the meantime I got married, got my own place and am living a good life. All my work is LiveCode related. Just telling you that so you guys don't take me for "someone that toys with this cute language every once in a while but is not serious".
Even if you already have your favorite language, it is worth checking out LiveCode, just to learn what else is out there. There are beginner webinars starting next month, I think...
oh... I forgot to tell, LiveCode can import HyperCard stacks... =)
A bit of trivia: MYST, the CD-ROM game, was first shipped for the Mac and written entirely in HyperCard (with a few custom extensions to load the color graphics as quickly as possible).
I think what killed Hypercard, in some sense, was what it made popular: like the Amiga, it was too closely tied to its hardware.
HyperCard ran black-and-white windows of 512x342 pixels because that is what the original Mac had (there was a version that did some color things, but that was, and felt like, a serious bolt-on job). Moreover, all graphics were bitmaps.
Bringing that into the 'real world' as it existed at the end of the eighties would have taken a lot of resources, and it was not guaranteed that the end result would still be HyperCard. For example, if one allowed resizable windows, laymen would have to learn about layout algorithms ('is this button 50 pixels wide, or a quarter screen wide? Is it 20 pixels from the bottom, 10% of screen height, or should its bottom border match that of that text field over there?').
Also, at the time, there were attempts to include some Hypercard/Director-like features to QuickTime. HyperCard was considered for that functionality, but did not make the cut.
I think that versions of Hypercard past v1.0 allowed for stacks of arbitrary window size; I know for a fact that I made up at least a few 1024x768 ones in v2.2.
You are correct in that I don't think it supported resizable windows without the use of some XCMD though.
You are correct. Surprisingly, I even found evidence for that: http://support.apple.com/kb/TA31048:
* All the cards in a given stack must be the same size, even if there are several different backgrounds in the stack.
* Smallest card size: 64 x 64 pixels
* Largest card size: 1280 x 1280 pixels.
FTA: "And if you think that XCode, Python, Processing, or the shit soup of HTML/Javascript/CSS are any kind of substitute for HyperCard, then read this post again."
I always hope in the back of my mind that someday there will be a way to create a web interface as easily as you could in Hypercard.
I miss it so much.
No such luck. The Web architecture inherently sucks:
Take that article with a large grain of salt. Much of it has been proven wrong by now. ("HTML is where the Web starts and probably where it will end.")
Take anything from loper-os with a massive grain of salt, he is a long-term troll who has been writing these sorts of wistful but ultimately empty posts for several years now. Calling everything and everyone stupid and bloated and yearning for the good old days is his thing. I only mention this because some HN readers might not be fully aware of his history.
The distinction between a "troll" and an actual human with unpopular opinions is lost on you?
You aren't the only one: http://www.loper-os.org/?p=91
I don't think loper-os is a troll. I think there are some fundamental social deficiencies at play (the profoundly arrogant and strange README in his SVN trunk is a must-see), but I believe he genuinely means what he says.
The sad part is that it hasn't fundamentally changed. Sure, there are layers upon layers of redundant redundancies to smooth out issues, but they haven't really been fixed.
Waterbear is something that is headed in that direction. http://waterbearlang.com/
Along these lines, have you seen http://springbase.com?
The article actually has a really good example (a calculator) showing how HyperCard could be used.
HyperCard was a big part of my junior high years. It could even be extended, e.g. there was one add-on that allowed color graphics to be displayed and I learned to make some simple games that way.
But it's not a good example.
You're sitting at a device with a numeric keypad and a 9" screen; replicating the calculator interface as clickable buttons, with a text box as a simulated LCD, is a horrible misuse of the potential of a computing environment.
A screen, a keyboard and a hardware ALU. Being used to run an OS which draws some keys and a screen, interprets mouse movements, parses and interprets Applescript and parses text arithmetic operations, so it can pretend to be some keys and a screen connected to a hardware ALU.
And this is hailed by our "anti-bloat" author as a great example of simplicity which normal people love, and its limits are fine, compared to any other system - e.g. Visual Basic 3 - which is needlessly complex.
It's almost funny, until you read his seven tenets of computing and find that any program which encounters any error should enter a debugger, so you can fix it and carry on. I think that would drive anyone insane.
> It's almost funny, until you read his seven tenets of computing and find that any program which encounters any error should enter a debugger, so you can fix it and carry on. I think that would drive anyone insane.
You prefer inexplicable crashes? Now these are enough to drive someone mad.
And if you do prefer them, wire your debugger to a Blue Screen of Death emulation.
Have you never watched a video on your PC despite owning a perfectly serviceable TV?
Yes.
I have also loaded up an OS and browser built on a cross-platform GUI framework and a TCP-IP stack, and used it to do a DNS lookup and send a HTTP compliant request across a tangle of hundreds of miles of interconnected systems to Google's servers to statistically analyse my query, run it through around 700 servers, through an index of several billion web pages, and thousands of years of Youtube videos, millions of pictures, hundreds of thousands of news articles and twitter streams, ranking the results, and at the top putting the results of my calculation "2+1" or whatever.
But then, I'm not the author with the weirdo view of 'simple'. What's your point?
I don't think the calculator was intended as an example of a great program, or of the "potential of a computing environment". I think it was intended to be a very straightforward example of something that anybody could make as a practice project in HyperCard.
If you're new to programming, "make a calculator" is a great project. It involves simple math -- which you probably already understand -- simple operations, and simple concepts.
It also illustrates some of HyperCard's strengths. I can't think of any other development environment in which "make a calculator" would be a suitable project for the end of the first week of a programming class. And, in the end, the programmer gets something that they can look at and interact with, and which can be easily extended. ("OK, now make it do factorials!")
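As a concrete illustration of how small such a project stays in HyperTalk, here are two sketched handlers (illustrative only, not from the article; the field name "display" is invented):

    -- script of a digit or operator button, e.g. the one named "7"
    on mouseUp
      put the short name of me after card field "display"
    end mouseUp

    -- script of the "=" button: let HyperTalk evaluate the expression
    on mouseUp
      put the value of card field "display" into card field "display"
    end mouseUp

Each button carries a few lines of its own script, and "the value of" does the arithmetic, so the whole calculator is little more than a card full of buttons.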
> It's almost funny, until you read his seven tenets of computing and find that any program which encounters any error should enter a debugger, so you can fix it and carry on. I think that would drive anyone insane.
I wish it worked that way!
Yes, if programs were constantly crashing into debuggers, it would drive people nuts. Absolutely, no argument there. But, I would hope that that alone would motivate programmers to make them crash less.
What we have instead are mysterious black boxes, and I hate that. But, I'm on the end-user support side of things. Here, let me give you some examples of stuff we've dealt with in our little shop in just the last few days:
1. An iMac that hates booting. Sometimes it boots, sometimes it won't. We invoke verbose mode, we get some sort of helpful text, and then it all goes away to a black screen. Or, it decides, "OK, all done with verbose mode!", switches to a gray screen, and hangs. Or, we attempt an install from any of our sets of install DVDs, and the most helpful error message we can dig out of any log anywhere is "i/o error in dvd-rom" (or something similar). It is literally impossible for us to pinpoint the source of the hardware trouble without shotgun-replacing every one of the major components in the machine.
2. A FreeBSD system that occasionally does a hard hang. No error log, anywhere. At all. Just halts. Hardware problem? Software problem?
3. An Acer Aspire One with all kinds of really unhelpful error messages in the system logs. Everything seems to be glitching everywhere. Ah, but Windows 7 helpfully generated a mini-dump file of just one of the crashes. Maybe we can track down a bad driver? Let's see, we'll just find and install dumpchk.exe and ... hmm, need to fix debugger symbols for this and ... wait, there's no stack trace? ... uhm ... oh look, it has a "probable cause" at the bottom: "hardware". That's helpful?
I know I'm forgetting some. Anyway, it's like this for us all the time.
I used to know & love MacsBug. Even with the old MacOS programmer's switch, I could occasionally figure out something useful. I would love it if, instead of calls like, "my computer is stuck at a black screen, should I reboot?", we'd get calls like, "my computer just went to this black screen that says null pointer exception at a bunch of numbers in iaStor.sys".
And, even better, if we were so inclined, we could notify software vendors of the specific errors we ran across, so we could do a better job of helping them track down bugs.
> It also illustrates some of HyperCard's strengths. I can't think of any other development environment in which "make a calculator" would be a suitable project for the end of the first week of a programming class.
Visual Basic 3?
> What we have instead are mysterious black boxes, and I hate that. But, I'm on the end-user support side of things. Here, let me give you some examples of stuff we've dealt with in our little shop in just the last few days:
I don't need examples - I wrote my comment on a phone, and twice it popped up "Input system error. The system is being restarted", which, it seems, is a spell-check problem. And that's a system which causes me more annoyance than it does good even when it's working normally.
We have a firewall with a subsystem which crashes when enabled, and the manufacturer says it's a known problem with no estimated fix time, the only option is to disable that feature - one of their main selling points of the device.
A web server, showing problematic memory use and no process using it.
GVim on Windows 7 x64 - I could (not sure if I can remember what it was now) repeatedly make it hang, not just itself but also the entire OS, with an operation on a large file which is instant on Linux. Yet Task manager shows no high CPU or memory or disk use.
Anyway, it's like this for us all the time.
Yes, but it's a fallacy to think I could practically do anything about it if only it wasn't such a black box. Do you really have time and motivation to become expert enough in all the things you deal with that you could fix any problem if only it was more explorable?
You may as well ask a restaurant if you can watch the staff cook all your food over their shoulders so you can intercept and change things if you think it's going in a direction you don't want.
Even if you are able to taste too much sugar in an Apple Pie, it doesn't follow that you could remove the sugar if only you were in the kitchen watching how much was put in (it's dissolved by then), and it doesn't follow that you could tell if too much was put in just by looking - it depends on how sweet the apples are, and whether the sweetness changes with length of baking.
See also: the idea of systemantics.
And what am I going to do when it's your webserver crashing, or your iCloud server, or DropBox, or if it's my car navigation system or my phone's spell checker?
> But, I would hope that that alone would motivate programmers to make them crash less.
Programmers don't make software that crashes for fun, you know. If it took no effort, they would make programs which don't crash right now. But it does take effort, and that's a limited resource which needs to be traded off.
> Visual Basic 3?
Never used it, couldn't say. But, I did find some online tutorial type stuff for VB6: http://www.vb6.us/tutorials/little-more-advanced-hello-world...
I think we're going to get mired in semantics and splitting hairs. After all, "get name of me" doesn't objectively make any more sense than "lblHello.Caption". Still, I find HyperCard's interface much cleaner in general, and key phrases like "Private" and "Option Explicit" are to me huge red flags for a beginner language.
So, you're right that VB3 could be used for a "make a calculator" lesson in the first week of a programming class. But, so far, I think I'd rather do it in HyperCard.
> Yes, but it's a fallacy to think I could practically do anything about it if only it wasn't such a black box. Do you really have time and motivation to become expert enough in all the things you deal with that you could fix any problem if only it was more explorable?
I think it's a fallacy to say you'd need to become an expert in all the things you deal with to benefit from the openness; you just need one person to fix it and distribute the changes. How many users benefit from CyanogenMod, thanks to Android's openness, despite having never touched the source?
> Do you really have time and motivation to become expert enough in all the things you deal with that you could fix any problem if only it was more explorable?
Well, that's our job. And, I specifically gave examples of unexpected behavior, in the sense that these were things that shouldn't ordinarily be behaving this way. The examples I gave were of cases where it definitely would help to have better troubleshooting tools available.
> And what am I going to do when it's your webserver crashing, or your iCloud server, or DropBox, or if it's my car navigation system or my phone's spell checker?
Whatever it is that you do now?
> Programmers don't make software that crashes for fun, you know.
I didn't say otherwise. Not sure why you brought this up.
I guess I still don't understand why "crash to debugger" would be worse than current behavior.
> Programmers don't make software that crashes for fun, you know.
> I didn't say otherwise. Not sure why you brought this up.
You implied that this would motivate programmers to make software which doesn't crash, as if software which doesn't crash is pretty much a choice they could make but aren't making. I'm saying programmers are already trying to make software which doesn't crash, and it's just not that easy. If a user seeing an error message isn't helping, how will a user seeing a debugger instead change things?
> I guess I still don't understand why "crash to debugger" would be worse than current behavior.
It would be worse in terms of user experience, and I'm sceptical that it would help a tenth as much as implied; crashes are so deep, so numerous, so subtle, and so often due to some interplay between "working" systems.
So the default action would be: wait for a debugger to load, then close it again. And that would be annoying.
No, I said crash less. I completely avoid any debates about "perfect" software, they're pointless.
Maybe I misunderstood you. When you said, "I think that would drive anyone insane", I assumed that for some reason we were talking about a greater quantity of crashes. (Might've been the "any program ... any error" bit beforehand.)
Here, let's assume we're stuck with two options, and for both options, we'll assume that the initial rate of crashes is the same:
1. Program crashes. There is an unhelpful error message, or no error message, or it hangs. User reports the error, or doesn't, and reboots.
2. Program crashes. Debugger immediately loads. User can contact support, or not. Support may receive actually useful information about the cause of the error. User reboots.
I don't really see why the debugger scenario is worse in terms of user experience, and I'm a loud and vocal advocate for improving UX. To me, it's clear that there are strong benefits and no additional drawbacks to having a drop-to-debugger policy for broken software: even if an end user can't do anything with it, we might be able to, and even if we can't, the end user is no worse off. You specifically say "wait for a debugger to load", but I remember MacsBug being pretty darn snappy. I can't think of a reason why it would be necessary to wait for one to load, and certainly no longer than you already wait for an unhelpful error message as it is.
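To make that concrete, here's a rough sketch of what a drop-to-debugger default can look like; this is Python and pdb purely as an illustration of the policy, not a claim about how MacsBug or any OS-level debugger would actually be wired in:

    # Minimal sketch: route every uncaught exception into the standard
    # pdb debugger instead of printing a traceback and exiting.
    import sys
    import pdb
    import traceback

    def drop_to_debugger(exc_type, exc_value, exc_tb):
        traceback.print_exception(exc_type, exc_value, exc_tb)
        pdb.post_mortem(exc_tb)   # poke at frames and variables, then quit

    sys.excepthook = drop_to_debugger

    def buggy():
        return 1 / 0              # any crash now lands you in the debugger

    buggy()

The point isn't that end users would type pdb commands; it's that the crash site is preserved so someone who knows what they're doing can look at it, instead of the process just vanishing.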
And, if dropping to a debugger was really a terrible, terrible thing, then yes, I think application developers would put more effort into preventing it. I don't expect software to magically become flawless, but we run into plenty of errors that appear to be caused by a programmer making an effort vs. reward tradeoff in favor of letting the bug go.
I mean, you and I both gave examples of frustrating software problems without even trying hard. There are countless others. Either these problems are an inevitable and unavoidable consequence of complex systems, as you seem to be saying, or they are examples of people putting in less effort than they could. If it's the former, we are well and truly fucked, because software is only getting more complicated, not less. If it's the latter, then I don't think it's fair to argue against efforts to make software crash less. You can't have it both ways.
Anyway, I think we're getting sidetracked, and this little discussion isn't fixing any bugs. Suffice to say, I don't think that mocking the author for daring to suggest that programs should error-drop into a debugger is a very powerful criticism. I for one would really appreciate it if they did that, so I guess I'm as nuts as he is.
I was only mocking the author for wanting ubiquitous debugging as a top-7 must-have computing feature while at the same time decrying complexity and pining for a simple system.
I agree with what you're saying, broadly.
> I was only mocking the author for wanting ubiquitous debugging as a top-7 must-have computing feature while at the same time decrying complexity and pining for a simple system.
There is no contradiction between the two.
I had a sudden hypothesis after reading this thread last night. (No proof, mind you, just an idea.) I wonder if HyperCard's success was for the same reason as UNIX's success: a well-chosen, small set of interoperable tools that allowed the user to do far more than the sum of their parts. And, as computing and interfaces have become more complex, we've just tacked on complexity to them rather than reinventing them to adapt.
For instance, UNIX's treatment of everything as a file is a great tool, but I think there was a time where that simplified a much greater percentage of computing than it would today. Similarly, I suspect HyperCard died because the abstractions it was using had to be reimagined to stay competitive.
Spreadsheets (as mentioned elsewhere) are an example of a product that has managed to stay focused on a simple set of tools, and when features were tacked on, they often stayed out of the way instead of adding complexity to day-to-day operations. I'm not sure they've aged well, but I guess they're the best we have at the moment.
IMHO, the Linux programming environment is a mess precisely because not everything is a file. Not even sockets: even though a socket can mostly behave like a file, there are differences. Plan 9 was supposed to be the reinvention of Unix, and a lot of useful stuff was ported from Plan 9 to modern-day Linux, including the /proc filesystem, which is really, really useful.
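To make the "almost a file, but not quite" point concrete, here's a small sketch in Python (standard library only; example.com is just a placeholder host): you can read a socket through an ordinary file object, but creating and tuning it requires socket-specific calls that have no file equivalent.

    # Sketch: a TCP socket can be wrapped in a file-like object for reading,
    # but you cannot open() it, and options like TCP_NODELAY have no
    # counterpart for regular files.
    import socket

    sock = socket.create_connection(("example.com", 80))          # not open()
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)    # socket-only knob

    sock.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    with sock.makefile("rb") as f:    # only here does it look like a file
        print(f.readline())           # e.g. b'HTTP/1.0 200 OK\r\n'
    sock.close()

Plan 9 pushed this much further by exposing the network itself as files under /net, which is roughly the kind of consistency being wished for here.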
Unfortunately the UNIX philosophy is so powerful that you don't understand it until you live and breathe UNIX. And it also doesn't scale for some use-cases, but this philosophy is the reason why UNIX is not only alive, but the dominant platform.
Also, HyperCard is NOT a "small set of interoperable tools". And neither is Excel.
Though I haven't done any Linux programming, I can definitely see your point. An abstraction seems only as useful as it is consistent.
You're right: HyperCard and Excel aren't a small set of interoperable tools like UNIX is. I would argue, though, that they are close: a walled garden of interoperable tools. Rather than guide you through steps to a specific end (the extreme example is a wizard for, say, a mail merge), they present you with tools you can use to get to that end. Obviously, they don't work particularly well outside their garden, and the great UNIX advantage of being able to add new tools to your toolset is cumbersome at best in them.
I do wish I understood the UNIX philosophy better. Perhaps I'll learn more over time.
I loved Hypercard and BASIC too, but this post strikes me as being a little affected by nostalgia.
With Hypercard you could make something for someone who 1) has an Apple, 2) has Hypercard installed, and 3) has the right version of Hypercard installed.
With HTML/CSS/Javascript you can make something to run almost anywhere. It's not 'easy' but it is useful.
> With HTML/CSS/Javascript you can make something to run almost anywhere.
With gargantuan effort.
And your creation will still appear distorted in unpredictable ways to many users.
Don't forget, there's Hackety Hack http://hackety-hack.com/ and Scratch http://scratch.mit.edu/, two great tools for new programmers to get their feet wet and make something fun in the process!
There's no mystery, here. No great conspiracy -- the article has a screenshot of the smoking gun.
http://www.loper-os.org/wp-content/hypercard-calc/hc12.jpg
The scripting language had a verbosity and ambiguity that only a lawyer could love. The documentation for that language was similarly impenetrable to anyone with a background in CS, requiring a serious commitment to trial and error to perform operations that were trivial in BASIC.
It looks that way, but it is not that verbose. Compare:

    get name of me
    put the value of the last word of it after card field "lcd"

with

    it = event.target.name
    currentCard.fields["lcd"].append(it.text.asWords().last())

It looks verbose, and in some sense it is, but having lots of implicit state made it a nice environment for what it was.

That is total conjecture and is not historically accurate. Hypercard was ridiculously popular and widely used amongst people who wouldn't ordinarily program (lawyers, historians, graphic designers etc). Hypercard was not aimed at CS graduates, but even if it was, it comes from a time long before CS courses taught Java (and in many cases even C). This kind of verbose, human-readable scripting was exactly the sort of thing that a CS graduate would have experimented with alongside the more traditional Pascal and Lisp.
Visual Basic played a similar role for me many years ago. I've moved on since, but at the time it was exactly what I needed. The value of a WYSIWYG GUI editor and a simple event system to a budding developer can't be overstated.
I think it'd be interesting to re-write HyperCard in JavaScript and HTML5, as a way to get kids interested in coding (and maybe use JavaScript instead of AppleScript as the card language?). Not sure how difficult this would be, as I've never actually used it, only heard of its majesty and wonder. Cool that you can still run it on an emulator. I'll have to try it out.
If the author loves HyperCard so much why doesn't he re-create it? Honestly, it would not take that long.
The answer is because he'd rather write some link bait trashing Apple and Steve Jobs.
And because he knows perfectly well that making HyperCard is not worth his time.
IMHO his calculator example makes it perfectly obvious why HyperCard not only was killed but deserved to die. Like all of Apple's languages of the time, HyperTalk is awful. Additionally HyperCard is not particularly flexible or elegant, nor does it produce particularly fantastic end results. It's crap.
And in a day like today, where we have the web enabling people who have never even heard of a pointer (let alone had to deal with pointer math) to write full-fledged, performant, distributed, multi-user applications, why does anyone in their right mind spend 5 minutes bemoaning the loss of HyperCard? It's absurd.
The code for this could have been simpler/less verbose. It's been a while, but I believe in the number buttons this would have worked:

    put the short name of me after fld "lcd"

and in the equals sign button:

    put the value of fld "lcd" into fld "lcd"

(edited to fix a missing quote)

Thanks!
Also there is an extra eval (the first one is unnecessary, I think). But I am too lazy to revise the screenshots.
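For anyone who never touched HyperTalk, here's a rough modern equivalent of that pattern in Python/Tkinter (widget names and layout are invented for illustration): each digit button appends its own label to the display, and "=" evaluates the display text exactly once.

    # Sketch of the HyperCard calculator pattern discussed above.
    import tkinter as tk

    root = tk.Tk()
    lcd = tk.Entry(root)                  # plays the role of card field "lcd"
    lcd.grid(row=0, column=0, columnspan=4)

    def press(label):                     # 'put the short name of me after fld "lcd"'
        lcd.insert(tk.END, label)

    def equals():                         # 'put the value of fld "lcd" into fld "lcd"'
        result = str(eval(lcd.get()))     # a single eval, as suggested above
        lcd.delete(0, tk.END)
        lcd.insert(0, result)

    for i, label in enumerate("123+456-789*0.=/"):
        cmd = equals if label == "=" else (lambda l=label: press(l))
        tk.Button(root, text=label, command=cmd).grid(row=1 + i // 4, column=i % 4)

    root.mainloop()

It's obviously more ceremony than the two HyperTalk lines, which is rather the article's point.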
I remember making a game with HyperCard when I was at school and loved that I could visually script the buttons.
We hand drew pictures of 3D rooms and then scanned them in and put big invisible buttons over the doors etc.
I personally think the www killed HyperCard: suddenly you could do very similar things but have them instantly shared globally.
In my mind HyperCard's real innovation was creating your own links which would take you to any other card you desired. I think plain old simple HTML came along and killed the poor thing. HyperCard lived in a time of share-it-on-a-disc, and I imagine Jobs saw its days as numbered.
Bill Atkinson, the creator of Hypercard, is on record [1] wishing he'd had the foresight to extend the links to the network. Hypercard was inches away from being the first "web browser".
[1] http://www.wired.com/gadgets/mac/commentary/cultofmac/2002/0...
Hypercard was very interesting to me, enough so that I bought my first programming book, Danny Goodman's The Complete Hypercard Handbook. I loved tinkering about with Hypercard on my very first computer, a Powerbook 100 that I bought new in 1992.
I was sad when Apple decided to only include Hypercard Player with new Macs, which I also think ultimately helped contribute to its death. People liked Hypercard but not enough to pay money for it. By the time Steve killed it, it was a lumbering zombie of its former self.
The thing I've always felt held back these paradigms is something that is inherent in them: they are too visual.
That sounds crazy, but just look at this article and how he described what he did. He had to take a million screenshots to show it. The same is true for, say, the Android app builder, or the Lego logic thing: the way you share code is via a screenshot.
And that just doesn't scale very well I don't think. It makes copy/pasting, the most basic way that people learn, really difficult.
HTML5 is pretty awesome, or rather WOULD be pretty awesome if we could erase the last 15 years and all start with CSS3 compliant browsers. But it is such a mess now that I feel bad for anybody starting from scratch.
But the combination of HTML for layout and JavaScript for behavior is pretty darn powerful, and I think pretty accessible as well. You could certainly build something approaching the simplicity of HyperCard on that stack.
But I do like the author's comments about the shame that there are far fewer 'programmers' than back in the old days. BASIC on the Apple II is of course the other example (among others), where it was just so easy and almost natural for ANY user to start getting a feel for things and hacking around. Losing that really IS a shame.
I remember using HyperStudio (a later version) in 5th grade in ~96. That was one of my earlier "programming" experiences, and I spent a lot of time making a shooter called Teddy Bear Doom for our class. However, the next year I found VB and found it a lot more flexible. (I also decided I was a PC about that time.)
The author makes an interesting point re: computers as bicycle vs train, that nothing like HyperCard exists today and apple now has a clear and hard separation between laying track and riding on it. I agree, but I wonder if HyperCard (or a modern version) makes the same mistake many wysiwyg solutions make: if you do something enough, you'll want more capable tools, and if you're a casual hobbyist, the wysiwyg solution will be too complicated anyway, with a disjointed interaction metaphor to boot. If I were a 5th grader today, I think I'd be better served by starting with python than something like hypercard.
The author makes a very valid point: the modern Way of personal computing is about the controlled user experience; allowing people to get their mess all over with their own work would disrupt that Way.
Don't appreciate the crack about the 'aspie software engineer' though. Definitely not needed.
Has this guy played with Interface Builder lately? I fail to see his point. Apple has poured tons of resources into making development for its devices simpler, and that's part of the reason you have so many amateur apps on the App Store.
Jobs understood the division of responsibility between Apple, with its extremely limited resources, and its developer community. What wasn't core to selling computers was cut. We are talking about a company that was 90 days from bankruptcy when he took over. I think he made a good decision.
If somebody has wanted to do a proper visual programming environment in the vein of HyperCard for the Mac, what in the last 13 years has stopped them? It wasn't Jobs.
> Apple has poured tons of resources into making development for its devices simpler...
"Simple" and "pleasant" are different things. Digging a trench deep enough to bury a car is "simple."
> Has this guy played with Interface Builder lately?
Yes, actually I have played with it.
It makes me want to vomit. The whole NextStep stack does, in fact. Because I have used OpenGenera. And HyperCard.
> if somebody has wanted to do a proper visual programming environment in the vein of HyperCard, what in the last 13 years has stopped them?
Here's an example from an unrelated field. The Kalashnikov rifle is more than half a century old. Why has nothing replaced it as the world's most popular weapon of war?
Often, complexity and full-featuredness is precisely what people don't want.
Broken example: the Kalashnikov is still very much available, while HyperCard is at best a historical curiosity. And it _has_ been upgraded to the AK74, which means it's a quarter century old, not half. It has also been transformed into the Type 56, the SG550, and the Galil.
Where is that progress with regards to HyperCard?
Mikhail Kalashnikov did not regard the 74 as an improvement. The project was moved forward against his objections.
And HyperCard did indeed end up reincarnated as cheap imitations, just like the Chinese Type 56. The analogy holds.
Can't comment on the other two -- small arms aren't really my field.
Does that mean that it really wasn't an improvement? First you argue that popularity defines success, but then you switch gears and appeal to the inventor's opinion.

> Mikhail Kalashnikov did not regard the 74 as an improvement.

Seems like the No True Scotsman fallacy. If someone had upgraded HyperCard, but the original creator(s) felt it wasn't an improvement, would you argue that it wasn't a "true HyperCard"?
> If someone had upgraded HyperCard, but the original creator(s) felt it wasn't an improvement, would you argue that it wasn't a "true HyperCard"?
In fact, he implicitly does: http://en.wikipedia.org/wiki/SuperCard
This wasn't meant to be a general statement about the opinions of original creators.
I happen to respect that particular inventor (Kalashnikov). And I agree with the reasons he gave. (Going into them here would be going too far off subject. Google is your friend.)
> Has this guy played with Interface Builder lately?
This guy writes a lot of well executed but fundamentally vacuous articles; I wrote about a previous one here: http://jseliger.com/2010/09/30/computers-and-network-effects... last year. At this point, PG's essay: http://www.paulgraham.com/trolls.html applies to him. But you wouldn't notice as much from any individual article or submission; it's only through the collection that his tactics become apparent.
I think he's more of a crackpot than a troll (and I think the difference matters).
I am not a troll. Everything I write is deadly-serious.
And I do not submit my articles to this site, although I do read it when they inevitably end up here. Complain to the person who does.
He's kind of pissed off. I see nothing special about it except for the "natural" language.
Y'know, I can see where he's coming from. I'll get to that in a moment, but first: I'm starting to get a little sick of HN's tone lately. I'll readily admit that I've contributed to that in the past, but some of the comments in this thread ("person is a troll", "writes vacuous posts", "is a crackpot") are really over-the-top asinine.
He's a better person than I am, because if it were me, I'd already have a script in place that checked the referrer and would post a blank page with "Eat a dick" for anybody coming from HN. If nothing else, it would ensure that the submission wouldn't get as many upvotes (or would get flagged) and I wouldn't feel compelled to put up with the abuse.
Anyway:
So, IIRC, my progression as a young programmer went something like: BASIC on a Commodore 64 / Vic-20, to Logo on I-don't-remember, to HyperCard (and then on to QBasic and Pascal and C and C++ and OOP and on and on).
HyperCard was amazing because its barrier-to-entry was so, so low, and I agree with the author's comments that there still isn't anything quite like it. For one thing, it was on most Macs by default at the time, so you didn't have to find a copy of the software and install it first.
As he shows, it was a piece of cake to get started with. A budding HyperCard programmer could easily learn new tricks by downloading anybody's stack and reading the code. Since it wasn't compiled, you could learn from it. Anything that anybody else did, you could take apart, and learn how to do.
And it grew with you. You could make something as simple or as complex as you wanted. My very first, very naive foray into AI was in HyperTalk; I discovered I could write self-modifying stacks, and decided to see if I could teach a HyperTalk stack to talk to me like a person.
HyperCard also introduced me to online forums for the first time. I still remember, fondly, downloading J5iverson's XFCNs from eWorld. From there, I discovered the world of the early internet -- the alternative to BBSs. Whereas BBSs at the time let me easily chat with someone else in my town (or, more often, play TradeWars or something), eWorld let me chat with people "across the pond" for the first time. For a young kid, this was a life-changing, world-shrinking event.
I disagree with the comments that SuperCard is a reasonable alternative. I don't remember the details now, but while I appreciated the addition of color in SuperCard, it brought with it other complexities that I disliked. I played around with SuperCard but ultimately went back to HyperCard.
So, what I'm getting at is, if it weren't for HyperCard, I don't think I'd be a programmer right now. HyperCard was simple enough for a beginner, and rich enough to keep my interest. It was a huge influence on me. I really can't overstate that.
Almost two decades later, I was approached by an employer who wanted me to teach computer programming to his son. His son was young, not yet in high school, pretty sharp, and, y'know, geeky. Liked video games, liked taking things apart. Not exactly a challenging pupil in terms of motivation.
I spent a ton of time trying to figure out just what in the hell environment to use to teach him. JavaScript? You have to know a lot of other stuff before you can really begin to do anything of value in JavaScript. Before we could do the calculator example in the author's blog post in JavaScript, I'd have to teach basics of HTML, the DOM, and eventually we'd either end up using JQuery or going over the whole "browsers are different in how they handle the same code" discussion, which, honestly, is one of the most stupid problems in the history of computing when you think about it.
Anyway, I went through a bunch of options and finally settled on something called Kids Programming Language (or Phrogram). What a damned mess that was. It wasn't very long before I had to teach the concept of objects to the kid. And scope. And then half the time the entire environment would just up and crash with no helpful explanation. There's a fun problem for a newbie programmer: write some code, your environment crashes, no explanation.
If the field has improved much since then, I'm unaware of it. Not having something like HyperCard available to young programmers really is a tragedy, because it means that future programmers are going to be introduced to programming in college (which, often, is a joke, and IMO also too late in an individual's mental development), or they're going to have to tough it out through JavaScript or Ruby or Python or whatever the inscrutable popular language-of-the-month is, and that's going to really narrow down the field of people interested in getting into programming.
Some free market adherents might respond with, "Well, if there was demand, someone would build it, so obviously there's no demand". Honestly, I find that entire argument completely boring. It's clear that there was a lot of demand for it; did nobody want it before it was invented, and has nobody wanted it since it became no longer supported? I don't think so. I think there is a market for such a thing, and it just hasn't been built yet.
I really hope it will be. It's already on my list of near-future projects if nobody else does it first.
Finally, a breath of sanity on HN.
The "natural" language is actually the weakest part.
Because it really isn't natural at all, it only looks natural. AppleScript is a kind of COBOL. You have to know exactly what (natural-sounding) keywords and clauses you can use. Try some real natural language and the interpreter is suddenly lost.
Alan Kay's STEPS project drew some inspiration from Hypercard. I haven't gotten to try it, alas. http://www.vpri.org/pdf/tr2011004_steps11.pdf
Who remembers the HyperCard Smut Stack? http://finance.groups.yahoo.com/group/HyperCard/message/2809...
Hypercard was a wonder. I used to teach street people how to make simple, useful applications that they thought of themselves.
Sure it was limited in what it did, and that was part of its beauty. It was a tool for quickly hacking together a simple application. A tool that nearly anyone could learn to use quickly. And it was fun and spontaneous.
And the Hypertalk language was very interesting to work in. It was like a limited dialect of a spoken language; a pidgin for computers.
Still surprised that nobody's made the connection to FileMaker.
FileMaker actually predates Hypercard (depending how you define it), and is similar to the point where you can still find Hypercard->FileMaker conversion tools.
Filemaker is also screens + controls + simple scripting language (albeit not quite as easy to grasp as HyperTalk/Applescript), and is wildly popular for an audience similar to that of Hypercard users.
Does anyone remember publishing HyperCard stacks on the web with the WebStar mac web server? That was pretty cool! It even translated the controls and fields on the stack to html forms that worked in the web browser. http://143.50.28.215/HyperCGI.html
I think HyperCard was special in that other people's stacks were easy to come by, and easy to disassemble. There was no compiled distributable form; all stacks could be cracked open if you wanted to see how they ticked, and modified just as easily if you wanted to see how the program's behavior would change.
Filemaker is similar to Hypercard.. http://www.filemaker.com/
You know what's actually the product I've met with the most similarity to HyperCard? Microsoft PowerPoint. You have cards, buttons that can navigate between them in arbitrary ways, effects that can trigger on loading or unloading a card, etc. And for everything else, you have VBA, which, like AppleScript, allows buttons and fields to do arbitrary things (with generic COM objects, even!) I once wrote a full-scale Dragon-Quest-style CRPG in PowerPoint, using the cards as a scene graph; it was actually quite friendly.
There's some truth to this (as appalling as the notion is). While I'm teaching Python to my 7th grade nephew his school is having him 'program' in Powerpoint. As dismissive as I am of the notion, it does allow students to quickly get some stuff up on the screen and start being interactive.
Fortunately my nephew has taken to Python and is even starting to become a language snob "That looks like a stupid way to program".
Hmm... putting PowerPoint and Python side-by-side reminds me a little of Shoes (http://shoesrb.com/), which is also something that "allow[s] students to quickly get some stuff up on the screen and start being interactive", but is purely code-based, rather than having any sort of GUI interface construction tool.
Quite a few things are detectably similar to Hypercard. Even MS Visual Studio.
None of them have the love-at-first-sight appeal. Most seem like an unbearable ordeal once you've used the real thing.
Wow, that brings back some happy memories. Writing Eliza style chat bots on my Mac SE in Hypercard was a lot of fun :)
Fun fact: the best-selling game Myst was developed in HyperCard!