Some thoughts on social networking and Usenet (2018)

jfm.carcosa.net

88 points by donut 3 years ago · 80 comments

commandlinefan 3 years ago

Part of the problem with Usenet is that it isn't "free" like Twitter and Facebook are. (They're free because they host ads, and there are all sorts of problems with that, of course, but you don't pay a monthly subscription to them). Way back, it used to be you got an NNTP server bundled with your ISP which was ok, because you were the only customer who knew what that was and how to use it, so you weren't creating a lot of load on your ISP. Even then, the NNTP server didn't carry much content, wasn't well maintained, didn't save much, so you had to pay more for a third-party server if you wanted to participate.

My hope for Usenet back in the day was a fully decentralized implementation that would allow each Usenet client to act as a client as well as a mini-server to cut the middleman out. Freenet (and I2P, I think) was sort of based on this idea at a very high level, but went in sort of a weird way (and didn't build on NNTP either).

  • dsr_ 3 years ago

    In 1996, my local ISP ran Usenet on the beefiest PC-platform machine I had ever seen: a dual Pentium-II 400 with 128MB of RAM and 6 9GB SCSI disks. At times Usenet ate most of a T1 (that's a 1.5Mb/s pipe).

    So if you want Usenet, right now, every $5/month minimal VM that I'm aware of has more than enough CPU, RAM, disk and I/O to support you and two dozen friends, as long as you don't carry binaries groups.

    • throw0101a 3 years ago

      > At times Usenet ate most of a T1 (that's a 1.5Mb/s pipe).

      How much of that was due to alt.binaries.*?

      • readingnews 3 years ago

        All of it. Like 90-something percent of it. And I think you meant to say alt.binaries.pictures.erotica.*

        I was the sysadmin at an ISP in the 90s also. We spent the majority of our time talking about that NNTP server. Is it worth it? How do we purchase more disk when the SCSI bus is full and disks are crazy expensive? How do we get more CPU? How do we keep the load down at sync time? I think we spent more time talking about that than playing Doom.

        Still, Horny Rob's BBS files were all on alt.binaries.pictures.erotica.hornyrob (or something like that) and, well, that kept a LOT of customers happy, so we kept that pile of servers running.

      • indymike 3 years ago

        > How much of that was due to alt.binaries.*?

        Most ISPs stopped carrying .binaries.* groups because of this.

        • commandlinefan 3 years ago

          I wonder how expensive it would be with today’s disk prices - I can’t imagine binaries.* have grown that much in size themselves.

      • jeremyjh 3 years ago

        My first thought too! Probably 99%.

    • Narishma 3 years ago

      > In 1996, my local ISP ran Usenet on the beefiest PC-platform machine I had ever seen: a dual Pentium-II 400 with 128MB of RAM and 6 9GB SCSI disks.

      That's a feat, considering the Pentium II wasn't released until 1997, and the 400 MHz version late 1998.

      • dsr_ 3 years ago

        That might have been the successor, then. You'll forgive my memory at this range.

  • darrenf 3 years ago

    > Way back, it used to be you got an NNTP server bundled with your ISP which was ok, because you were the only customer who knew what that was and how to use it, so you weren't creating a lot of load on your ISP. Even then, the NNTP server didn't carry much content, wasn't well maintained, didn't save much

    In the UK, Demon Internet in the 90s maintained a usenet server for customers and one for non-customers (pubnews.demon.co.uk). They carried plenty, had decent retention, and were very well maintained. And busy too. Demon were the largest consumer ISP at the time IIRC. So, at least over here, what you say doesn’t match my recollections.

    (I worked at Demon in 1998; if either news server went down I had to fix it if I could, or raise 3rd line expertise otherwise, 24/7)

  • Mc91 3 years ago

    > My hope for Usenet back in the day was a fully decentralized implementation that would allow each Usenet client to act as a client as well as a mini-server to cut the middleman out.

    Gnutella started as this, but for a lot of reasons it didn't scale, with every node broadcasting to every other node. The network became a number of federated super-nodes, or ultra-peers, or whatever you'd call them. These would be nodes that had decent uptime, had a steady network connection, could handle a number of connections, etc.

  • fweimer 3 years ago

    Google still offers free Usenet access, but alas not over NNTP (I think).

    These days, with reasonably cheap virtual servers and reduced Usenet traffic, you can just run your own server after finding one or two peers to exchange articles with. The Debian packaging of INN is quite good, I think. It's still a bit of work to set up things, but so is reading and writing articles.

    All of this applies to the text-only Usenet. A lot of for-pay Usenet was actually about access to binary-only groups with content of questionable copyright status (or even legality, depending on country). I don't know if that's still a thing today. The bandwidth requirements for these binary-only groups could be significant.

  • jhallenworld 3 years ago

    Sounds like NNTP combined with BitTorrent..

    Back in the day I remember dreaming up ways to optimize the NNTP server - surely there was a better way than storing each article in a separate file. But actually it was very convenient to have a shell account on the same system that ran the server. This is why I enjoyed TheWorld (Software Tool & Die). You could cat the articles if you wanted.

    /usr/spool/news, C-News, innd (Rich $alz!), it's all coming back..
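For the uninitiated, the one-file-per-article spool layout the comment alludes to mapped a group name and article number directly onto a filesystem path, which is exactly why a shell account on the server let you cat articles. A minimal sketch of the traditional mapping (the spool root follows the /usr/spool/news convention mentioned above; this is an illustration, not any particular server's code):

```python
from pathlib import PurePosixPath

# Traditional spool root, as in C News / early INN setups.
SPOOL = PurePosixPath("/usr/spool/news")

def article_path(group: str, number: int) -> PurePosixPath:
    """Map a newsgroup + article number to its spool file.

    Dots in the group name become directory separators, so
    comp.lang.c article 1234 lives at
    /usr/spool/news/comp/lang/c/1234 -- one plain file per article.
    """
    return SPOOL.joinpath(*group.split(".")) / str(number)

path = article_path("comp.lang.c", 1234)
```

With this layout, `cat /usr/spool/news/comp/lang/c/1234` really did print the article, headers and all; the cost was one inode per article, which is part of why admins dreamed of better storage schemes.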

  • LinuxBender 3 years ago

    You are right, the decent high-retention + SSL NNTP providers cost at least $6/mo [0]. There are some free ones [1] but I have never tried them; I've heard they are sometimes slow and lack SSL. I liked the speed of Giganews for binaries but had problems canceling my account and had to call someone.

    [0] - https://www.techradar.com/best/best-usenet-providers

    [1] - https://teddit.zaggy.nl/r/usenet/comments/6l8h82/free_usenet...

    • fweimer 3 years ago

      I think news.individual.net only charges 10 EUR per year, but is strictly text-only of course.

  • bbanyc 3 years ago

    The lack of a business model is why any attempt to revive Usenet or run a new federated protocol like Mastodon is probably not going to get anywhere. Somebody's got to pay for it, and the quality of discussion isn't nearly good enough to attract paying customers. (The binaries may be, but isn't it silly to pay for piracy?) Ad-supported services have bad incentives all around, but nobody's ever come up with an alternative way to pay for it, aside from being small enough that voluntary donations will do.

  • rjsw 3 years ago

    There are free servers, they don't hold binary groups but they work fine.

  • icedchai 3 years ago

    There are free Usenet services out there (like aioe.org) if you don’t need binaries. Activity is pretty low though.

  • akadempythag 3 years ago

    The payments make sense, though. The market for Usenet providers and indexers is very capitalist.

    You try to find the organizations who can catch and index posts from the people you're interested in, at the lowest cost.

    It is uncomfortable, but I think the fediverse may need to be more ephemeral than most centralized social networks today. If you shout into the void and nobody hears you...a centralized network will archive your thoughts, but a decentralized one will let them die out as soon as your server does.

    It may not be a bad thing. People will re-upload useful content and personally, I'm glad that my 12-year-old self's online musings are lost to the wind.

LastTrain 3 years ago

Any of you archivers out there know whatever became of the Deja News Archive after Google bought it?

https://www.wired.com/2001/02/google-buys-deja-archive/

This was integrated into Google search, and it worked great for many years, but then Google slowly chipped away at access to the archive and now it's all but gone.

I know about the various Usenet archives on archive.org (utzoo (rip), some CD usenet archive), but I want that Deja archive!

  • secabeen 3 years ago

    It's all still there, if you know how to dig. Here are posts in rec.arts.drwho from before 2004-11-02:

    https://groups.google.com/g/rec.arts.drwho/search?q=before%3...

  • squarefoot 3 years ago

    Every time someone finds a USENET post about a good product or service, that's one less ad impression and one less potential click on a website; in other words, USENET couldn't be monetized, so it had to go.

    Google literally embraced and extinguished it: first it incorporated the message base into web search, but apparently that wasn't enough, so it hid the "Discussions" search option (users had to enable it either with some search-string fu or with third-party browser extensions), then killed it entirely. From that moment, a search that once would return links to people discussing X would mostly return links to companies selling X. The rest is history.

    https://www.seroundtable.com/google-discussion-search-dead-1...

  • indymike 3 years ago

    Deja News -> Google Groups.

pmoriarty 3 years ago

I so miss killfiles and scorefiles, and it's mindboggling how 30+ years after Usenet clients had them they're still virtually non-existent in any social media readers that I'm aware of.

Usenet clients gave their users so much power, it's almost criminal that modern news readers have yet to catch up to them several decades later.
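For anyone who never used them: a killfile hid articles matching a rule outright, while a scorefile added or subtracted points per matching rule and buried anything below a threshold. A rough Python sketch of the mechanic (the rule format, addresses, and threshold are invented for illustration and don't follow any particular client's syntax):

```python
import re

# Each rule: (header, regex, score delta). Negative deltas bury an
# article; a big negative delta acts like a classic killfile entry.
# Addresses and patterns below are made up for illustration.
SCORE_RULES = [
    ("From",    r"spammer@example\.com", -9999),  # kill outright
    ("Subject", r"(?i)make money fast",  -500),
    ("From",    r"friend@example\.org",  100),    # boost a friend
]

KILL_THRESHOLD = -1000  # hide anything scoring at or below this

def score(headers: dict) -> int:
    """Sum the deltas of every rule whose regex matches its header."""
    return sum(delta
               for header, pattern, delta in SCORE_RULES
               if re.search(pattern, headers.get(header, "")))

def visible(headers: dict) -> bool:
    return score(headers) > KILL_THRESHOLD

articles = [
    {"From": "spammer@example.com", "Subject": "hi"},
    {"From": "friend@example.org",  "Subject": "trip report"},
]
kept = [a for a in articles if visible(a)]  # the spammer never reaches you
```

The key property is that the rules live on the reader's machine, applied by the client, which is exactly the "users had power" point made above.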

  • PontifexMinimus 3 years ago

    > Usenet clients gave their users so much power

    And there's the problem: when you give users power, it's harder to sell their eyeballs to advertisers.

  • fragmede 3 years ago

    Third party killfiles exist for Twitter. I was made aware of this via one particular topic. https://twitter.com/erininthemorn/status/1566988180678758400...

  • TacticalCoder 3 years ago

    > I so miss killfiles and scorefiles, and it's mindboggling how 30+ years after Usenet clients had them they're still virtually non-existent in any social media readers that I'm aware of.

    Indeed, and I don't think people realize how great these were. There's a kind of fallacy you often read when you mention this, which to me can be summed up with the following bogus reasoning: "Usenet got killed by the Web, so everything Usenet and Usenet clients did was the wrong way to do things".

    Killfiles/scorefiles were better than any system I've used since then. Nothing even comes close.

  • EricE 3 years ago

    The analogs to killfiles do exist in some clients: https://tidbits.com/2015/09/04/how-to-mute-unwanted-tweets-i...

iisan7 3 years ago

Thanks for posting this. Everything old is new again. With buzz about moving to Mastodon, I was just thinking about why usenet couldn't be a viable social media platform as it represents so much of what people like about the fediverse, and then some. It would be fascinating to see the usenet protocol repurposed and new clients and newsgroups built upon lessons from the past decade.

  • floren 3 years ago

    It's pretty easy to set up your own INN2 servers and have them exchange their own hierarchy. However it also feels like there's just a ton of features and options in there which mostly make it very challenging to set up the simple case (and leave you feeling like you've left holes open). I've never really dug into the NNTP spec but it seems like it might be pretty easy to write your own implementation which just focuses on the common use-case.

  • donutOP 3 years ago

    > Thanks for posting this. Everything old is new again.

    Right? It was such a good time :-) You're welcome.

    > why usenet couldn't be a viable social media platform

    I wonder if identity plays a role here. Centralizing points (likes, retweets, etc) drives people to work on getting attention with their posts, to drive engagement, but also invites troublemakers and controversy.

  • 8bitsrule 3 years ago

    Coincidentally, me too! Getting rid of binary exchange would keep traffic quantity much more reasonable (and civilized). Federation lets everyone see all the messages (no signing up for anything). Named-topic forums are much better than searching for tags. (TIL also that Thunderbird has News built-in!)

desiarnezjr 3 years ago

Google in many ways helped sink USENET by acquiring Dejanews, then slowly replacing the USENET functionality with Google Groups. Dejanews, in hindsight, could have been a pretty reasonable proto-social network on its own.

Not that I'm bitter...

  • EricE 3 years ago

    It's not an accident they sunk it :(

    • desiarnezjr 3 years ago

      Nope. :(

      To add some context for those not from that era: Dejanews was the only practical way to search USENET at all, so it was effectively a USENET search engine with a crude web client. Crude as it was, it was also much easier than the many NNTP clients. Most people I knew used both pretty effectively as USENET grew.

      It was simple and functional, only to be slowly eroded by Google Groups.

      Have no idea where Google Groups is now. But Dejanews was in many ways the Reddit of that era.

      • giantrobot 3 years ago

        To expand on Dejanews' search (for the uninitiated), Usenet servers had limited retention. That was the length of time they'd keep a message on the server before deleting it. If I sent out a message a few months ago and you just checked the server today, it's entirely possible you'd never see my message. You might see more recent replies to my original message but your server might not have my old original message.

        Because every Usenet server carried copies of messages, each one determined its own retention time based on the amount of storage the machine had.

        Dejanews imported a bunch of old Usenet archives and ran their own servers which subscribed to everything and indexed it all. They were to Usenet what Google was to the web.
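The expiry behavior described above can be pictured as a periodic job that drops anything older than the server's retention window. A toy sketch (the 14-day window is an arbitrary example; real servers tuned retention per hierarchy and per disk budget):

```python
import time

RETENTION_DAYS = 14  # arbitrary example; admins tuned this to fit the disks

def expire(spool: list[dict], now: float) -> list[dict]:
    """One pass of a server-side expiry job: keep only articles
    younger than the retention window."""
    cutoff = now - RETENTION_DAYS * 86400
    return [article for article in spool if article["posted"] >= cutoff]

now = time.time()
spool = [
    {"id": "<old@example.invalid>", "posted": now - 30 * 86400},  # expired
    {"id": "<new@example.invalid>", "posted": now - 1 * 86400},   # kept
]
remaining = expire(spool, now)
```

Because every server ran its own version of this job independently, a thread could be half-missing on one server and complete on another, which is the gap Dejanews filled by archiving everything.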

      • secabeen 3 years ago

        Still there, if you get your search terms right:

        https://groups.google.com/g/rec.arts.drwho/search?q=before%3...

robomartin 3 years ago

Some of the most interesting and intellectually stimulating conversations I have ever had online were on USENET back in the 80's. I literally made friends all over the world in fields spanning model aircraft, electronics, software development, and aerospace. Met a bunch of them. Visited each others' homes. Etc.

Today? Geez. No comment. Every meaningful relationship I have online is from people I knew and met in real life, some of them a decade or more ago.

  • Melatonic 3 years ago

    I remember as a little kid getting some dude in Australia to somehow call us (I believe over some type of IP phone or something) and hold the phone up next to the Australian MotoGP FM radio announcing. This was by far the best way to get the most up-to-date announcing on races. Now it's so easy!

kkfx 3 years ago

> Killfiles and Scorefiles

That's one of the most important things most people ignore: they mean we can have our PERSONAL aggregator instead of using someone else's algorithms and censorship. Sure, in Usenet's time (or back at Xerox, where these concepts were implemented for the first time in known history) the scoring and self-filtering techniques were limited, but nowadays this is a CENTRAL point. Usenet is a decentralized network no one owns, no one can really censor it as a whole, the aggregators needed for anything high-volume are personal things, posts can be archived locally so they do not disappear, and so on.

Coupled and integrated (as Gnus offers) with RSS feeds and mail, and in the case of Gnus also HN (the nnhackernews backend) or Reddit (nnreddit), we can have a CONSISTENT and LOCAL UI for ALL our public information/communication infrastructure in a robust yet simple manner. That's the classic nuclear-war-resilient internet vs the modern centralized and censored web.

Oh, BTW, Usenet today is almost abandoned, but some have rediscovered it, mostly for piracy, as an alternative to BitTorrent. That's relevant because it means that while binary groups are a bit odd in the modern world, they perform well enough for such large file-sharing usage.

Or, long story short: by evolving these tools we can have a modern classic desktop that happens to be a PERSONAL human exobrain, work desk, and tool, with the human at the center. With the modern web and its WebVMs we instead get modern dumb terminals for modern mainframes.

Do you prefer owning nothing "and being happy," like in the infamous WEF/2030 video, OR do you prefer to own your small slice of the world, a peer among peers?

  • donutOP 3 years ago

    > have our PERSONAL aggregator

    That's the dream, isn't it.

    So if there are 100 people whose messages you willingly watch, how do you discover new people? (Maybe you do so organically when the people you follow mention them?)

    And if you _do_ allow yourself to taste from the firehose of unfiltered messages, a single personal list of users/messages wouldn't scale and you have the classic spam problem. Do you share your rules with others (and form an aggregation like https://www.dnsbl.info/)?

    Curious what you think about https://atproto.com/:

      Algorithmic choice
      
      Control how you see the world through an open market of algorithms.
    • kkfx 3 years ago

      I'll answer indirectly: you probably know that "manual tagging" is hyper-efficient compared to the automatic kind, and you have probably heard the polemics about YT, Meta, etc. algorithms that push people toward extreme positions in apparently neutral ways: some videos push toward the political extreme right or left, some recommendations push toward younger and younger girls, and so on..

      So how do you discover people? Well, on Usenet that happened at a slow but effective pace: you start by choosing some groups you think you're interested in; the total number of groups is not that high, and the names are sufficient for a simple full-text search over the hierarchy. You start participating, and other members point you to other groups: "you might ask there", "try here", etc. In modern terms this is Reddit subs with a bit more discoverability and fewer abandoned groups. And Reddit today seems effective enough that skilled people add "reddit" to most Google search queries to find better content...

      Then the spam problem: in Usenet's Eternal September era, antispam was limited; nowadays simple filters suffice, AND we can import another piece of neglected IT evolution: PGP/GnuPG with its web of trust. Sure, some will keep rotating usernames, but some will keep them for decades. Those users can share public keys, signing each other's in a classic chain of trust, and exchange spam data automatically from their own clients. Such an approach is limited, but it's still better than today's blocklists, for instance. It's still a cohort of people who decide, BUT that cohort is not a for-profit company.

      Long story short: even with current progress, expert systems are FAR from human selection quality, and it's actually possible to have partitioned human selection shared spontaneously with others. This is probably slower at discovering new stuff, but it brings higher-quality results and avoids certain drifts, so in the end it's better. Today's volume of posts, often called an "infodemic," needs to slow down and go up in quality. Eternal September did not show the limits of Usenet but the limits of a certain technology; we can plug in more to surpass them and keep evolving, instead of reinventing the wheel with startup-like experiments that keep duplicating similar concepts with slight variations.

      Competition in the Oracle/CIA style of "different teams pitted against, or semi-isolated from, each other" produces some results, but history proves that the classic endless evolution of Lisp/Smalltalk systems produces MUCH better evolution, because it does not suppress diversity but lets diversity merge and emerge. ALL "recent" IT evolution has proven that countless times. ALL "recent" scientific trends do the same.

fweimer 3 years ago

It's a bit strange how this article applies the past tense. Gnus still works fine, can be used to read mail, and despite some questionable design decisions such as line-based .overview files, can handle quite a bit of mail with its nnml backend. I don't use that much of Gnus functionality actually, but the threading, disappearance of messages I have read, and the ability to score down the occasional high-noise thread is well worth it. I cursed a lot when we switched to Gmail at work and the auto-creation of mailing list folders no longer worked (it's no longer predictable if list mail has a List-Id: header due to internal Gmail deduplication), but I found a reasonable solution to that as well (which I probably should write up and post somewhere).

Some even gate mailing list mail into a news server like INN so that they can keep using their favorite newsreader. Thankfully, with Gnus, that isn't necessary.

papito 3 years ago

I feel like there is way more talk now in general about the current broken state of the Internet, corporatized and designed to inflict productivity-destroying scatter brain on all of us.

I started to aggressively use newsletters and digests of all sorts, for one. Politics, technology, what to stream. It is SUCH a better experience and a time saver.

donutOP 3 years ago

How was poster identity handled in Usenet/NNTP? From what I remember, it was just a "From:" header and spoofing was easy. Or was there more to it? (Maybe because most posters wrote to their local server, which required auth, you could see which server the message originated from and decide if the sender's address matched the server address...? It's been so long.)

If not, then Twitter, Mastodon, etc. all seem to have a somewhat strong notion of identity.
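To make the spoofing point concrete: a Usenet article was plain RFC 850/1036-style text, and nothing in the protocol verified the From: header; only the Path: header, appended by each relaying server, hinted at origin. A sketch using Python's standard email library to build such an article (the addresses and group are made up):

```python
from email.message import EmailMessage

# The poster freely chooses every header below; no server checks them.
msg = EmailMessage()
msg["From"] = "anyone@example.com"     # unauthenticated, trivially spoofed
msg["Newsgroups"] = "comp.misc"
msg["Subject"] = "Identity on Usenet"
msg.set_content("The From header is whatever the client says it is.")

wire = msg.as_string()  # the text that would be handed to the server
```

Servers did prepend their own names to the Path: header as articles propagated, so a careful reader could sometimes spot a mismatch between From: and the injection point, but that was forensics, not authentication.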

  • indymike 3 years ago

    > If not, then Twitter, Mastodon, etc. all seem to have a somewhat strong notion of identity.

    Some of us would sign Usenet messages with PGP/GPG to deal with this. Lots of users didn't know what to do with the "geek code block" at the bottom of the message.

  • manv1 3 years ago

    It was the old internet. There was no identity verification.

    One day identity verification will occur at the network level. Until then, we have all this half-baked shit.

  • SoftTalker 3 years ago

    > From what I remember, it was just a "From:" header and spoofing was easy.

    Yes. Same was true of email back in the day.

jrnichols 3 years ago

I think these are very valid thoughts, even today. I loved the old days of telnet & Pine, and then Netscape Navigator with built-in email & newsgroups. Even Outlook Express did a decent job with NNTP.

But the endless flood of spam was too much, along with the ever-growing risk of "illegal content." It just became way too risky: ISPs started dropping it, and the spiral continued.

amadeuspagel 3 years ago

> My opinion of conversation views is that they became widespread because they are relatively easy to implement, and because the threaded interfaces in early 2000s graphical email clients were terrible.

No, it seems more likely that it's because they're easier to display on mobile, so mobile first design leads to using them everywhere.

antod 3 years ago

I seem to remember one reason usenet worked so well was the pretty stringent insistence on good netiquette. Of course it wasn't foolproof, but nearly all people wanted to follow the implied rules most of the time.

Today's internet demographics and social media landscape, though, probably make Eternal September pale into insignificance.

  • pixl97 3 years ago

    I agree. While spam and bad actors did exist, the crowd was small most of the time, the amount of spam tended to be far more limited, and there was typically a human at the other side posting the crap.

    If you open up a publicly listed HTTP POST API these days, the amount of crap that will be posted to it is legion. You'll have spammers and spammers' bots hit it endlessly. You'll have broken scripts pound at it till the end of time. You'll have clever users figure out how to turn it into a data storage API. You'll have every dark thing that hides in the shadows of humanity use you as its new cave. If you allow binary uploads, then that API is now a porn site or serving warez. The people pushing the most questionable stuff will come from a vast range of proxied IPs and infected jumpboxes.

    It's been some years now since I've been in charge of managing servers that require public facing internet presence and I cannot tell you how glad I am because of it. Attempting to maintain operational integrity on the open internet is like attempting to maintain structural integrity in a blast furnace.

Melatonic 3 years ago

Anybody have any modern recommendations for file sharing on Usenet? I have heard there is a lot of obscure stuff still there that you will not find anywhere else (and I'm not specifically talking about pirating here; there's legal stuff as well).

amadeuspagel 3 years ago

> Do you see how different this was from the infinite scroll of current social media web sites and apps? The idea was that your newsfeed would be updated periodically. Maybe hourly, maybe only once a day. And the goal of your newsreader was to let you be caught up and finished before the next time it updated.

How is that desirable? Why would you want to finish reading something that might already be obsolete?

  • rout39574 3 years ago

    The idea is that you control it. You're presented with the place you stopped following the thread, and you can certainly skip to the new hotness if you like, but if you want the intervening context it's right there.

    That "maybe obsolete" state is -your- reading state.

EricE 3 years ago

He's spot on about the quality of clients for social feeds. Tweetbot is the only way I use Twitter. Cuts out all the algorithmic manipulation of my feed and has excellent controls to allow me to parse my feed and weed out the garbage. There are many tech people I wouldn't be able to follow if I wasn't able to strip out the political garbage.

LastTrain 3 years ago

Usenet worked until the demographic changed.

halfbrite 3 years ago

Anyone looking to create a new social network should certainly take cues from Usenet - of course, it was eventually overtaken by forums on the web.

Visiting Usenet today though will only lead to sadness, unless you're looking for something you shouldn't be or enjoy endless spam.

neilv 3 years ago

In addition to the `trn` and Gnus readers that the article mentions, another great one was `strn`.

They were much more powerful than any other forum interface I've seen since. Though the very compact thread views like in `old.reddit.com` and HN are nice.

  • fweimer 3 years ago

    Uhm, do you mean slrn? Never heard of strn (but that doesn't mean anything).

  • manv1 3 years ago

    Don't forget Nuntius, which was the best NNTP mac client. MT-NewsWatcher was pretty good too.

    Good times.

donutOP 3 years ago

The timeliness of this article is that it compares Usenet to Mastodon.

mjmsmith 3 years ago

nn[1] commands are still like muscle memory 20 years after I stopped using it.

[1] http://www.nndev.org

LAC-Tech 3 years ago

Usenet sounds great! Is it actively used still?

mattl 3 years ago

2018
