
GPT inputs outgrow world chip and electricity capacity

astralcodexten.com

19 points by quirkot 2 years ago · 14 comments

fulafel 2 years ago

Badly editorialized title.

Regarding the substance of the article, a curve through the three data points (1: 100x, 2: 25x, 3: 25x) could be fit lots of different ways besides "growing 30x per generation".
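
A minimal sketch of that point, in Python (my own illustration, not from the thread): two equally simple fits to the same three multipliers extrapolate to very different next-generation growth.

    import numpy as np

    # Per-generation compute multipliers as quoted in the comment above.
    ratios = np.array([100.0, 25.0, 25.0])

    # Fit 1: a constant multiplier, via the geometric mean (~40x here;
    # the article's 30x is yet another reading of the same points).
    geo_mean = ratios.prod() ** (1 / len(ratios))

    # Fit 2: a linear trend in log space, i.e. a declining multiplier.
    gens = np.arange(len(ratios))
    slope, intercept = np.polyfit(gens, np.log(ratios), 1)
    next_ratio = np.exp(slope * len(ratios) + intercept)

    print(f"constant-rate fit: {geo_mean:.0f}x per generation")      # ~40x
    print(f"declining-rate fit: {next_ratio:.0f}x next generation")  # ~10x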

danielscrubs 2 years ago

Really wish AMD would hurry up and implement a transparent module for GPU computing in LLVM.

It’s not good that GPUs are this opaque.

dekhn 2 years ago

These are all the same arguments that wrongly predicted that DNA sequencing would overtake hard drive storage capacity.

People aren't going to do truly uneconomic things just to scale language models exponentially.

marsissippi 2 years ago

Interesting pull quote:

GPT-4 needed about 50 gigawatt-hours of energy to train. Using our scaling factor of 30x, we expect GPT-5 to need 1,500, GPT-6 to need 45,000, and GPT-7 to need 1.3 million.
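
(For reference, a quick sketch of how the quoted numbers compound, assuming the article's 50 GWh baseline and 30x-per-generation factor:)

    # Compound the article's 30x-per-generation energy scaling
    # from GPT-4's cited ~50 GWh training cost.
    energy_gwh = 50.0
    for model in ("GPT-5", "GPT-6", "GPT-7"):
        energy_gwh *= 30
        print(f"{model}: {energy_gwh:,.0f} GWh")
    # GPT-5: 1,500 GWh; GPT-6: 45,000 GWh; GPT-7: 1,350,000 GWh (~1.3 million)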

RecycledEle 2 years ago

New technologies experience very rapid exponential growth for a while.

This should not be a surprise.
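
A toy illustration of the "for a while" caveat (my own sketch, not from the comment): logistic growth is nearly indistinguishable from exponential early on, then flattens as it approaches a capacity ceiling, here a hypothetical supply limit.

    import numpy as np

    capacity = 1000.0   # hypothetical ceiling (e.g., chip or power supply)
    x0 = 1.0            # starting level
    rate = 1.5          # early growth rate
    for t in np.linspace(0, 10, 11):
        level = capacity / (1 + (capacity / x0 - 1) * np.exp(-rate * t))
        print(f"t={t:4.1f}  level={level:8.1f}")  # ~4.5x/step early, ~flat late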
