I spent an evening on a fictitious web

paul.kinlan.me

74 points by kinlan 2 years ago · 38 comments

lewispollard 2 years ago

So the purpose of the website wasn't clear to you, you then figured it out, decided to write an article about it, but didn't explain to the reader what the website actually is?

  • kinlanOP 2 years ago

    That's a fair point. I chose more so to document the experience and how I felt.... I can update the article if that helps.

  • nuancebydefault 2 years ago

    Well, if you review a movie you try not to spoil by explaining its plot?

latexr 2 years ago

I went to the website, clicked on the search bar, and was immediately stopped from proceeding unless I provided a Google login.

Why is that necessary? No idea, they don’t say.

Doesn’t seem like a “web” I’d want to partake in, with Google as a gatekeeper even if they’re not the authors of the content.

  • sva_ 2 years ago

    Websim uses an LLM to generate a fictional website based on a domain name you provide, and displays the result in its virtual browser.

    LLM generations are quite costly, so it's difficult to offer them without some kind of anti-abuse strategy in place; I think that's fair.
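
The mechanism sva_ describes can be sketched as a single prompt-construction step. This is a hypothetical illustration only: Websim's actual prompts, model, and rendering pipeline are not public, and the function and URL below are invented for the example.

```python
def build_fictional_site_prompt(url: str) -> str:
    """Build an LLM prompt asking for a plausible page at a made-up URL.

    A hypothetical sketch of the technique, not Websim's actual prompt.
    """
    return (
        "You are simulating a web browser on an alternate internet.\n"
        f"The user has navigated to: {url}\n"
        "Respond with a complete, self-contained HTML page that this "
        "domain might plausibly serve. Invent the content; do not explain."
    )

# A made-up domain for illustration:
prompt = build_fictional_site_prompt("https://tea-reviews.example/oolong")
# The prompt would then be sent to an LLM, and the returned HTML rendered
# in a sandboxed frame (the "virtual browser").
```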

  • ThinkingGuy 2 years ago

    Yeah, the requirement to create a Google account was a show-stopper for me.

  • GaryNumanVevo 2 years ago

    They're just using Google for easy user onboarding, nothing to get all up in arms about. Websim is trying to sell plans for more than 30 generations a day.

    • jeroenhd 2 years ago

      Makes a little sense to show a "register with Google" button, but clicking the main UI element and being redirected to a Google sign-in screen is bad form.

      Also weird that apparently you can sign in via Discord, but you can only sign up via Google?

GaryNumanVevo 2 years ago

No more dead links; instead, Chrome will hallucinate what that website should have looked like.

  • GJim 2 years ago

    > Chrome will instead hallucinate what that website should have looked like

    I find the term 'bullshits' to be more apt than 'hallucinate'.

    As something of an aside: that the purveyors of AI use the latter, whereas those who interact with it use the former, speaks volumes.

    • sva_ 2 years ago

      I find the term 'bullshitting' less fitting, as it seems to anthropomorphize LLMs in a way that attributes agency to them that they seem to lack. As in, someone who bullshits presumably does so for some personal gain, which doesn't seem like something an LLM is capable of at the moment.

      It might (currently) be most apt to characterize these occurrences as shifts out of the training data distribution.

      • burner_fyllms 2 years ago

        About the same time that LLMs were starting to make the news, I was spending a lot of time with an elderly relative with severe dementia, and was struck by the fact that LLMs are doing the same thing she is: the word is "confabulating", meaning to come up with stories and rationalizations to fill in gaps in knowledge and memory.

      • WorldMaker 2 years ago

        The equal problem with "hallucinate" is that it also has far too many anthropomorphic connotations (a person having creative fun, a person on some form of drugs, a person in some sort of "sleep state").

        So far I'm coming around to the growing use of "slop", originally meant as an alternative to "spam" and to imply spam-like intent, but the great thing about this word choice is that the closest anthropomorphic connotation is to "pig feeding". Pigs can be highly intelligent, of course, but that's not the first image one has when thinking of a pig at a slop trough.

        • patapong 2 years ago

          I quite like "confabulation" as a term.

          From Merriam-Webster: to fill in gaps in memory by fabrication.

          • WorldMaker 2 years ago

            "Confabulation" also sounds too anthropomorphic to my tastes. Especially because "fabrication" often implies "intent to" by the actor in question. It's the exact same problem as "bullshit", just the G-rated grandiloquent version. To be fair, human languages were built to anthropomorphize almost everything so finding the right terms here is hard.

      • hunter2_ 2 years ago

        > someone who bullshits presumably does this for some personal gain

        That's one definition of bullshitting, but not the one being used here. If someone says "I think you're bullshitting me" then yes, you're being accused of consciously seeking personal gain. But if someone says "we were standing around bullshitting" then no, it refers to killing time with mindless communication, which is a quite good analogy for LLM output.

        • jrm4 2 years ago

          I actually like your first definition a bit better; it's very in line with the way the term was used academically when it was in vogue a few years ago: the idea that you're expressing information intended to appear factual without regard for how factual it is.

          The LLM does it because it's programmed to, and the human does it for some other self-interested reason, but both the process and the results are very similar.

          • hunter2_ 2 years ago

            > the idea that you're expressing information intended to appear factual without regard for how factual it is.

            That's my second definition! Sorry if I wasn't clear. My first definition (which aligns with the comment I had originally quoted) is that the speaker is aware that they're saying false things, and therefore has intent to deceive, typically for personal gain (they are bullshitting another person). My second definition is that the speaker has no regard for whether what they're saying is true or false (they are bullshitting with another person).

            An LLM does not bullshit you, it bullshits with you. It's fluff, not a bluff.

      • djeastm 2 years ago

        >as it seems to anthropomorphize LLM in a way

        "hallucinate" does the same thing, fwiw

        • krapp 2 years ago

          It seems impossible to come up with language to describe why LLMs are both convincing and unreliable ("hallucinate", "confabulate", "bullshit") or why the ability to converse in natural language does not denote intelligent cognition ("stochastic parrot") without anthropomorphizing them to a degree, given that these things are designed to anthropomorphize themselves.

          • burner_fyllms 2 years ago

            Yes, true, but saying "displaying the inherent flaws in the design that make them unsuitable for serious purposes" every time gets tedious.

        • vundercind 2 years ago

          “Hallucinate” connotes consciousness and self to me. Bullshit does not. Markov chain text generators bullshit, they don’t hallucinate. I’m not aware of anything in LLM tech that warrants implying any sort of awareness, understanding, or consciousness. Not even close.

  • kinlanOP 2 years ago

    It's not Chrome hallucinating; it's websim that is generating the content.

    • rng-concern 2 years ago

      I think they mean a hypothetical future version of the Chrome browser, not what websim currently is.

  • nxobject 2 years ago

    Thought experiment: think of a potential use case (perhaps offline) where some smart-aleck product manager thinks this might be useful.

    • GaryNumanVevo 2 years ago

      Come to think of it, I'm kind of surprised that Google Chrome doesn't have a "this link is broken, would you like to see what we have cached for this url?" feature.

superultra 2 years ago

This is interesting, but if you’re genuinely interested in recapturing the feelings many of us had at the beginning of the web, I would suggest playing Hypnospace Outlaw. It’s of course quite different from websim, but it’s really fun.

  • csixty4 2 years ago

    I just watched a YouTube video of it and I DEFINITELY need to check this out. They seem to have captured the feel of the old web perfectly.

  • autokad 2 years ago

    I struggle with the idea of monetization. On one hand, I think it's great that people can get paid for doing what they like to do, and it can encourage more content creation. On the other hand, everything becomes disingenuous and people become perversely incentivized: you get people gluing things to turtles so they can make videos of themselves rescuing turtles with 'lichens' on their shells.

    So I don't know; I am really torn on how I should feel about it.

teg4n_ 2 years ago

I wonder why they aren’t straightforward that this is just AI generating websites based on a URL prompt? It seems like they go out of their way to not say AI.

jschveibinz 2 years ago

You know, there are a lot of negative comments here, but I found this post to be enlightening, so thank you.

This YouTube video explains how this simulation could be useful to the startup community:

https://youtu.be/pdWS-ZJ3K8Y?feature=shared
