Ask HN: How to measure how much data one can effectively process or understand?

18 points by mbuda a month ago · 9 comments · 1 min read


Is there a scale of how much data one can effectively process, something similar to a "Kardashev scale for data"? What would be a name for such a thing? During Memgraph's Community Call (https://youtu.be/ygr8yvIouZk?t=1307), the point was made that AgenticRuntimes + GraphRAG move you up the "Kardashev scale for data" because you can suddenly get much more insight from any dataset, and everyone can use it (a large corporation does not control it). I found something similar under https://adamdrake.com/from-enterprise-decentralization-to-tokenization-and-beyond.html#productize, but the definition/example looks very narrow.

mbudaOP a month ago

Here are clickable links: https://youtu.be/ygr8yvIouZk?t=1307, https://adamdrake.com/from-enterprise-decentralization-to-to...

allinonetools_ a month ago

Interesting question. In practice, I’ve found the limit isn’t how much data exists but how much you can turn into action without friction. The clearer and faster the feedback loop, the more data you can effectively “use,” regardless of volume.

mikewarot a month ago

The limiting factor would be the density of information in the source material, followed by the cognitive impedance match of the receiver.

For example, a correct grand unified theory isn't useful if you don't know the physics needed to understand it.

rgavuliak a month ago

I would measure data by time to action. If you're not acting on the data, it's worthless.
