University of Minnesota Gopher software licensing policy (1993)

nic.funet.fi

72 points by mkotowski 2 years ago · 36 comments

zoeysmithe 2 years ago

What a time capsule of how early free-ish software tried to monetize itself and the odd politics of University computer systems.

I was curious about Yen and saw a small bio about him:

https://chinacenter.umn.edu/umn-china/history/alumni/disting...

It's a bit interesting, and shocking, that some of the members of the teams behind proto-HTTP software in the 90s were college graduates in the early 60s. If I'm doing my math correctly, Yen would be in his 80s today.

  • hypercube33 2 years ago

    I never really questioned why it was called gopher until today...but it makes sense. (Goldy the Gopher is the university mascot)

thomastjeffery 2 years ago

The great mistake I see here: everyone worth discussing is an organization.

The best corners of the internet are not groups of people: they are collections of content. The content cannot pay you a license fee. The content cannot demand itself be constrained to non-profit ends.

  • kragen 2 years ago

    in 01993 only organizations had internet connections

    i mean in some places you could get a dialup slip connection from something like netcom, but you would put the files you wanted to share on netcom's ftp server, not your own

    • thomastjeffery 2 years ago

      Even if that isn't 100% ubiquitously true, it's true enough to provide important context, so thanks for that.

      My overall point doesn't just apply to this instance, though. It's something I see all over the place, even today; particularly in conversations about moderation and censorship.

      Content has been siloed off so intensely that it's hard to even imagine a modern internet without arbitrary borders. Most of those borders are made across organizational lines. They are often made out of copyright, with the notion that some deserving party will be monetarily compensated.

      Those borders usually don't align to the content itself. Instead, they become arbitrary hurdles, or even walls; making it unfeasible or impossible to truly benefit from content. Nearly every incompatibility in software was created intentionally, to cement and enforce these borders.

      Now inference models (overconfidently called AI) like LLMs are all the rage. What do they do? They draw new borders. What are those borders meant to align with? The patterns that are already present in the content itself.

    • MongoTheMad 2 years ago

      This is simply not true. My grandpa was an early adopter in '92 and had an internet connection to his home in Little Falls, MN.

      • kragen 2 years ago

        you're right, what i said was actually false, because there were a number of individual people who had their own internet connections. i've met some of them since then. but we're talking about maybe a thousand people out of the millions on the internet. i didn't know any of them in 01993. literally everyone i knew on the internet got their internet access by belonging to an organization that had internet access

    • bluGill 2 years ago

      A few people had T1 connections to their home, something I was jealous of, but there was no way I could afford the cost. By the time I graduated, ISDN was available and my company paid for it, so it was even affordable. DSL came soon after and was affordable to 'normal' people.

1vuio0pswjnm7 2 years ago

In his book about the early web, Berners-Lee claimed this is what sank Gopher.

["Weaving the Web" (1999)]

  • dwheeler 2 years ago

    I agree with him. It was huge news when the licensing model was announced. Many people had expected it to be freely available. The day it was announced was the day that Web growth truly began. The rats didn't just jump from the ship, they created new ships. I feel bad for Bob Alberti in particular.

  • kragen 2 years ago

    rohit khare's 'who killed gopher? an extensible murder mystery' points out several other factors

    https://ics.uci.edu/~rohit/IEEE-L7-http-gopher.html

meepmorp 2 years ago

I remember using Mosaic for the first time and thinking that it sucked ass in comparison to gopher - so much less information available, and it was very hard to just browse the hierarchy to see what was on a server.

On the other hand, I kind of miss Mosaic's ability to easily turn off image loading. There's more than a few sites that'd be improved by using that feature.
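For readers who never browsed a Gopher hierarchy, the menu format in question is simple enough to sketch. A minimal parser for one RFC 1436 menu line, where the example line and hostname are hypothetical:

```python
# Sketch of parsing a Gopher menu line (RFC 1436), for illustration only.
# Each line is: <type char><display>\t<selector>\t<host>\t<port>

def parse_menu_line(line: str) -> dict:
    """Split one Gopher menu line into its tab-separated fields."""
    display, selector, host, port = line[1:].split("\t")
    return {
        "type": line[0],       # '0' = text file, '1' = submenu/directory
        "display": display,    # human-readable name shown to the user
        "selector": selector,  # opaque string sent back to the server
        "host": host,
        "port": int(port),
    }

item = parse_menu_line("1About Gopher\t/about\tgopher.example.org\t70")
print(item["type"], item["display"], item["host"], item["port"])
```

A client would connect to that host and port, send the selector followed by CRLF, and read back either a file or another menu of lines like the one above.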

  • mrob 2 years ago

    uBlock Origin in advanced mode allows easy blocking of images, both globally and for specific sites. However, in Firefox this only works for the normal view; in Reader View and in the Page Info dialog box it will be ignored.
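    For reference, a sketch of what such rules look like in uBlock Origin's "My rules" pane, using its dynamic filtering syntax (the hostname below is just a placeholder):

      * * image block
      news.example.com * image noop

    The first rule blocks large media/images everywhere; the second exempts a single site from that global rule.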

  • bombcar 2 years ago

    The HTTP web really took off with good search engines. Before that, discovery was much more curated, as it was on other methods. Even FTP!

    • joezydeco 2 years ago

      Everyone forgets the "directory" era. There was a time when Yahoo was the primary way to find things on the web. Getting your new website listed there was like winning the SEO wars.

      • ghaff 2 years ago

        I don't remember the timeframes exactly but at one point, I had a local "home page" on my Unix workstation that was basically a graphic and the links I was most interested in. Yahoo was probably there. Search engines were coming in but I mostly bounced around until Google came along. AltaVista was early on.

    • meepmorp 2 years ago

      Back in the day, there were hosts on the internet that let you browse their entire filesystem via ftp. This was in the days before shadow password files were a thing, too. I'm too upstanding of a citizen to have done so, but a friend of a friend once spent a couple weeks of computer time running crack on those things and managed to gain shell access to some of the machines.

      • ska 2 years ago

        If you had such an FTP service, there was a good chance that eventually you'd end up unwittingly serving porn out of a twisted little maze of nested directories with embedded special characters and the like.

        This killed a fair few useful-but-not-important sites.

        • kstrauser 2 years ago

          At one point a well-known FTP server would let you access it with Samba, complete with R/W access to certain directories. I had an Amiga with a small hard drive, a modem, and the "VMM" virtual memory program. Experiments led to me creating a 2GB sparse file on the FTP server, mounting that server as a volume, and pointing VMM at that sparse file. Voila! An Amiga with 2GB of RAM, so long as you didn't mind swapping at about 5KB per second.

          That was completely useless, but it was great fun to get working at all. I hope my sparse file was actually sparse on the server, too.
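          For anyone curious, the sparse-file trick is easy to try today on a Linux box with GNU coreutils (the filename is arbitrary):

```shell
# Create a 2 GB sparse file: the length is recorded in metadata,
# but no disk blocks are allocated until data is actually written.
truncate -s 2G sparse.img

# Apparent size vs. blocks actually used (in KiB):
du -k --apparent-size sparse.img   # 2097152
du -k sparse.img                   # 0, or close to it, on most filesystems

rm sparse.img
```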

          • veqq 2 years ago

            Tangentially, it turns out one could use late-70s technology to store 5 GB of data on a VHS tape: https://www.youtube.com/watch?v=xSnrQBfBCzY

          • schoen 2 years ago

            > An Amiga with 2GB of RAM

            Huh, what did the Amiga's memory model look like? Could you construct a pointer to 2 GB different locations? Did it have segments like the 8086 or something?

            • kstrauser 2 years ago

              It had a 32-bit address bus, so that was a nice, flat 2GB of directly addressable locations.

              Edit: You might've been asking a different question. Toward the end, lots of Amigas had MMUs, either as a separate chip or built into the CPU. VMM and similar programs used the MMU to implement paging.

              • schoen 2 years ago

                Those are both interesting answers, and I didn't really know anything about the Amiga's architecture (other than to have imagined wrongly that it might have had 16-bit addresses). Thanks.

        • shiroiuma 2 years ago

          Not just porn, but also warez (pirated software). In fact, I'd say warez was much more common than porn, though that might be because of observer bias...

    • kragen 2 years ago

      archie (ftp search engine) and jughead and veronica (gopher search engines) predated even the wwww (world wide web worm) search engine for the www

kragen 2 years ago

this is one of the major reasons i am posting this comment over a non-gopher protocol

mumblemumble 2 years ago

  > Remember when UNIX was given away free?
  > How many of you are using UNIX now?  It is licensed.
Fast forward 30 years... or 20. Ten, for that matter.

  • msla 2 years ago

    And the Unix variants people use (the Open Source BSDs, most Linux distros) are indeed legally not Unix, just imitations. And they helped kill Officially Licensed Unix mostly dead. Not just through price competition, but through adaptability and stability you don't get when the software is a Product owned by a Company with Executives who get Big Ideas and Grand Synergies. Usually, the most the official maintainers can do is tell you that you're on your own with your weirdo patches they'll never merge, but that's a lot better than a proprietary company saying it will never happen in this life or the next. Ownership can be death, as people don't like living under a sword of Damocles where someone else can unilaterally end what they're doing.

  • pasc1878 2 years ago

    Pedantically, this is still correct.

    You are not using UNIX unless you paid for a Mac or an iPhone (or higher-cost machines like IBM AIX, etc.).

    Linux is not a UNIX - it is not certified. No one has paid for it to be certified.

    • mumblemumble 2 years ago

      That's kind of exactly what I'm getting at. Officially licensed Unix is all but dead. It was outcompeted by freer (choose your own capitalization, either works) POSIX operating systems.

snvzz 2 years ago

Kermit was similarly hindered by its licensing.

It is an open protocol with open source implementations now. Just a bit too late.

  • mananaysiempre 2 years ago

    The plain-language explanations in the book[1] are still awesome, though. The exposition is fairly faithful to reality and doesn’t really omit any details. A style from a different time that doesn’t seem to have survived in today’s popular writing—which has its own good traits, but not these.

    [1] http://bitsavers.org/pdf/dec/_Books/_Digital_Press/daCruz_Ke...

    • snvzz 2 years ago

      Agree.

      If you like this kind of text, you might enjoy "C Programmer's Guide to Serial Communications" by Joe Campbell.

slyall 2 years ago

Good article on the history of Gopher:

The rise and fall of the Gopher protocol

https://www.minnpost.com/business/2016/08/rise-and-fall-goph...
