GPT-4's Has Been Revealed

thealgorithmicbridge.substack.com

14 points by sciolist 2 years ago · 6 comments

dkjaudyeqooe 2 years ago

The title is missing a word; it should be: "GPT-4's Secret Has Been Revealed"

skilled 2 years ago

To summarize, the author took what George Hotz said (8x200b models) and put it together in an article that talks absolute nonsense for the entirety of its 1,207 words.

  • theolivenbaum 2 years ago

    I almost stopped reading after seeing Hotz cited as a serious source

  • magospietato 2 years ago

    Glad I'm not the only one with this take. Assuming it's even true, orchestrating the interactions of 8x200b models seems like a fairly significant advancement in itself.

    My feeling is that the next generation of AI is going to be modular: connecting disparate models serving different purposes to generate a processing pipeline to achieve a singular goal. This, and the new function calling API seem to indicate this is an area of interest.

    • MuffinFlavored 2 years ago

      I'd like to understand what's different between the eight 200b models.

      What are their categories/specialties?
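The usual reading of the "8x200b" rumor is a sparse mixture-of-experts: a small learned gate scores every expert for a given input and only the top-k experts actually run, which is what would make the experts develop different specialties. A toy sketch of that gating mechanism, with all sizes and names invented for illustration (the thread confirms none of GPT-4's internals):

```python
import math
import random

random.seed(0)
N_EXPERTS, D_MODEL, TOP_K = 8, 4, 2  # toy sizes; "200b" would be per-expert params

# Each "expert" is a tiny random linear map, standing in for a large network.
experts = [[[random.gauss(0, 1) for _ in range(D_MODEL)]
            for _ in range(D_MODEL)] for _ in range(N_EXPERTS)]
# The gate scores every expert for a given input (one weight row per expert).
gate = [[random.gauss(0, 1) for _ in range(D_MODEL)] for _ in range(N_EXPERTS)]

def matvec(mat, vec):
    return [sum(w * x for w, x in zip(row, vec)) for row in mat]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(token):
    """Run only the TOP_K highest-scoring experts and mix their outputs."""
    scores = softmax(matvec(gate, token))
    top = sorted(range(N_EXPERTS), key=scores.__getitem__)[-TOP_K:]
    norm = sum(scores[i] for i in top)
    out = [0.0] * D_MODEL
    for i in top:  # the other N_EXPERTS - TOP_K experts never execute here
        y = matvec(experts[i], token)
        out = [o + (scores[i] / norm) * yi for o, yi in zip(out, y)]
    return out

out = moe_forward([0.5] * D_MODEL)
```

Whether the experts end up with human-interpretable categories is a separate question; in trained MoE models the specialization is learned from data, not assigned by topic.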

ryanschaefer 2 years ago

Has any work been undertaken to classify training data into groups, then classify prompts against the same criteria and send each prompt to the corresponding domain-specific model?
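The routing scheme described in the question can be sketched in a few lines. The domains, keywords, and model stubs below are invented for illustration; a real system would use a learned classifier (or embedding similarity) trained on the same grouping as the data, and each entry in MODELS would be a separately fine-tuned model rather than a stub:

```python
# Hypothetical domain -> keyword lists, standing in for a trained classifier.
DOMAIN_KEYWORDS = {
    "code": ["python", "function", "compile", "bug"],
    "math": ["integral", "prove", "equation", "sum"],
    "general": [],  # fallback when nothing matches
}

# Stub "models": each would really be a domain-specific fine-tuned LLM.
MODELS = {d: (lambda d=d: f"<{d}-model reply>") for d in DOMAIN_KEYWORDS}

def classify(prompt):
    """Score each domain by keyword hits; fall back to 'general'."""
    words = prompt.lower().split()
    scores = {d: sum(w in words for w in kws)
              for d, kws in DOMAIN_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "general"

def route(prompt):
    """Dispatch the prompt to the model for its classified domain."""
    domain = classify(prompt)
    return domain, MODELS[domain]()

domain, reply = route("please fix this python function bug")
```

This is routing at the whole-prompt level, which differs from per-token mixture-of-experts gating: here one specialist model sees the entire prompt, whereas MoE gates choose experts independently for each token inside one model.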
