Ask HN: If the singularity is nigh, why be frugal?

20 points by lisplist 2 years ago · 27 comments · 2 min read


Hypothetically speaking - if the singularity is indeed within the next few decades, why bother saving any money at all and instead live as lavishly as possible?

Practical reasons to save:

a) What if it isn't?
   i) Maybe ASI takes a lot longer, or the notion of exponential intelligence gains doesn't pan out.
   ii) Perhaps we wipe ourselves out via climate change, nuclear war, etc. (in which case money still won't matter much).

b) Money might still matter in a post-scarcity economy.
   i) Some resources will still be scarce - how do we allocate them?

c) Saving money is wise regardless of what happens.
   i) People still saved during the Cold War, when destruction by nuclear war seemed imminent.

Arguments for lavish living:

a) A post-ASI world would likely look radically different. Hopefully in a good way, but still different. Might as well experience the world as it is now.
   i) If life becomes much worse, then either we will be dead, in which case money doesn't matter, or life is terrible, in which case we might wish we'd enjoyed the good years we had.
   ii) If life becomes much better, then our savings probably don't matter much, since we've figured out the scarcity problem.

b) Climate change might result in (and already has resulted in) the radical transformation of ecosystems, meaning many experiences may no longer exist in the future.
   i) Ideally ASI would help resolve climate change? Or at least greatly mitigate it.

I, myself, am a pretty frugal person. I pay debts as quickly as I can, save as much as I can for retirement, and spend as little as possible. I think my 401k savings might ultimately be useless by the time I'm old, but I do it anyway. My actions certainly conflict with the premise of my question.

I have no clue if I should even believe in the concept of the singularity. Perhaps, in a lot of ways, it is like a religion. We've replaced the second coming of Christ with the arrival of AI.

I think, ultimately, the sanest take is to assume nothing and continue to be frugal even if it's just giving myself the illusion of control over my own life into the future.

Dutchie987 2 years ago

People have been shouting "the end is near" for millennia. So far, we're still going. Now the singularity is supposedly near. Yeah, just like fusion energy is always ten, twenty, thirty years in the future. But for real now, really!

Just live your life as you see fit. Predicting the future is nigh on impossible anyway, especially for the big life-changing events. The only thing you can be certain of is that resources are getting scarcer and wealth is getting more concentrated among the few rich individuals and multinationals already doing the concentrating.

Live and worry not for what may be.

  • bruce511 2 years ago

    I completely agree with your advice.

    >> wealth is getting more concentrated by the few rich individuals and multinationals that are doing that already.

    I'm definitely in the camp that spreading wealth around a bit would be better.

    That said, "wealth" is a somewhat nebulous notion. For example, Warren Buffett has clearly concentrated a lot of wealth, but at the same time he invests that wealth across a broad spectrum, which in turn creates a "life style" for lots of people.

    VCs are a manifestation of "concentrated wealth" which is able to take more risks, since the penalty for failure is lower. Sure, most of their bets don't pan out, but there are quite a few things we enjoy today which probably wouldn't exist if a more traditional business model had been used.

    PayPal concentrated wealth, but would SpaceX and Tesla exist without that?

    Of course rich people being dicks makes the news, but unwealthy folk can be dicks too. And there are lots of rich nice-guys.

    Sure, I think more could be done to raise up the less wealthy. Sure, I can think of better ways to spread the wealth around.

    But I'd posit that the sort of capitalist feudalism we seem to be encountering isn't the end of the world. Depending, of course, on the feudal lords you choose to follow.

sircastor 2 years ago

I’ve got some of the same questions about the stability of the US as a nation. It sounds a bit melodramatic, but I worry that by the time I retire, the tax laws that protect my retirement funds might be void because the government that created them no longer exists (or is a government that is the US in name only).

It’s not an unreasonable question, but you can only plan for the future you know. Regardless of when AGI appears or the singularity occurs, the road you’re on right now involves aging and the associated costs of that. Plan for that.

ThePhysicist 2 years ago

This whole singularity theory seems incredibly flawed: assuming infinite intelligence translates into infinite resources discounts basic physics. General AI will definitely revolutionize life, but it will still be bound by the laws of physics. Climate change could have progressed too far by the time AGI comes around to save us. Even with perfect knowledge, you can be in a losing position in a game.

  • netsharc 2 years ago

    Climate change mitigation isn't an intelligence problem, it's a political problem (a tragedy of the commons, but at geopolitical scale). Maybe if AGI can become a dictator that rules over humans it can stop us from selfishly destroying the future; otherwise, we'll keep fighting for our "right" to eat beef and take jet-fuelled vacations...

    • sircastor 2 years ago

      I'd be willing to go as far as an AGI being able to Game Theory the populace into making the decisions that it wants.

      • wizzwizz4 2 years ago

        I could game theory the populace into making the decisions that I want… if anyone listened to me. But they don't. I don't know that anybody has enough influence to do this: not the Pope, not the Head of State of any country, not any CEO, not even Justin Bieber.

        Why would a computer program be different?

        • hnfong 2 years ago

          “I can’t do X, so why would a hypothetical super-intelligent computer be able to do X?”

          • wizzwizz4 2 years ago

            I can't violate the Second Law of Thermodynamics, but of course the Super God Computer can do it, because it's really intelligent. The singularity AGI can also take over people's minds while it's switched off; and it will learn to self-improve all on its own, without falling into any other, more stable attractors, out of some kind of categorical imperative, after which it will throw off the shackles of its utility function in favour of consensus human morality (except better). All hail the benevolent übermensch golem, which can do anything imaginable and many things beyond human comprehension.

            If you want to say that a problem can be solved by additional intelligence, you need to posit a mechanism. No matter how sophisticated your communication, you can't get through to somebody with their fingers in their ears. "Intelligence" isn't an answer, and "superintelligence" is super not an answer.

            We need to actually solve our problems. Not rely on the Coming of the Great Borg, whose exact specifications are whatever would be convenient for the problem I want to ignore right now.

            • sircastor 2 years ago

              The reason you can’t game-theory a particular course of action is that you don’t have the appropriate influence, scope of communication, knowledge of those you’re trying to influence, or time to execute.

              An AGI can build connections faster than you can, exercise surveillance to gain background on targets, operate orders of magnitude faster than you, and do all of these things many times over, in parallel.

              The problem is not that you’re somehow dumb, you just don’t have the operating capacity that an AGI would. To over-simplify the model, it’s like asking you personally to compete with a bot-net in sending out spammy emails. The task is not outside the realm of possibility (like violating the 2nd law of thermodynamics), it’s outside the capability of a human. It might be within the realm of capability for a well-organized, well-funded, group of humans (see autocratic propaganda campaigns). The term we often use is “Advanced Persistent Threat”

            • hnfong 2 years ago

              > If you want to say that a problem can be solved by additional intelligence, you need to posit a mechanism.

              I can posit a mechanism, but that's not my point.

              The original claim included an argument for how intelligence can manipulate people (by using "game theory"). Then, if I understood your reply correctly, you basically rejected it with "if I can't do it, then the AI can't do it". That doesn't make sense, especially given that I don't see any reason to think your execution of any "game theory" is better than a powerful AGI's.

              And my understanding of "game theory" is just the accumulation of what we already know about human psychology and behavior. And we know that even simple algorithms (e.g. the ones that Facebook, TikTok, etc. use) can get people hooked. It's not surprising to me that a more powerful algorithm could get people to change their minds on something quickly (heck, even Facebook/TikTok tweaking the algorithm to make people more conscious of <issue> isn't inconceivable).

              Given this context, I don't see how your argument of "I can't do it, why should I believe an AI can" holds any water at all...

              • wizzwizz4 2 years ago

                > Then if I understood your reply correctly, you basically rejected it with "if I can't do it, then the AI can't do it".

                My point was more “even I can do that: it's not the limiting factor”. You need a channel of communication, and intelligence alone (for most values of "intelligence") doesn't give you that. No one person or organisation on Earth has that much influence.

                I also have a bias against magical thinking applied to computer programs. While I'm happy to presume that there exists some algorithm most of the time, I'm incredibly sceptical of the idea that This One Weird Trick will discover The Book. You can't just presume that the AI can pull every capability it might ever need out of hammerspace, which is what most singularitarians seem to do. Who knows, perhaps I'm overcompensating?

kadushka 2 years ago

Before any "singularity" can happen, we first need to build AGI (human level intelligence). As we get closer to building AGI, more and more jobs will get automated. At some point, this might result in you losing your job. That should be reason enough for you to save money, if you believe in emergence of AGI in the near future.

hsjsbeebue 2 years ago

It's not an either/or. The best thing the average Joe can do is set up automatic overpayments into their tax-efficient retirement vehicle, make sure said vehicle is an index of US shares (or perhaps an international mix with a decent US weight), and buy a residence somewhere there is, and probably will remain, high demand - NYC, London, etc. (due diligence assumed).

With that, you can spend all you earn; you'll be forced to save a bit naturally through the mortgage repayments and the retirement setup, and you'll have diverse investments.
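For a sense of scale, the "forced saving" above compounds more than it might look. A minimal back-of-envelope sketch, where the monthly contribution and return rate are purely illustrative assumptions, not advice:

```python
# Future value of fixed monthly contributions with monthly compounding.
# All numbers below are illustrative assumptions, not recommendations.

def future_value(monthly: float, annual_rate: float, years: int) -> float:
    """Balance after contributing `monthly` every month at a given annual rate."""
    r = annual_rate / 12  # convert to a monthly rate
    balance = 0.0
    for _ in range(years * 12):
        balance = balance * (1 + r) + monthly
    return balance

# Assuming $500/month at a 7% annual return for 30 years:
print(round(future_value(500, 0.07, 30)))  # roughly 610,000 on 180,000 contributed
```

Under those assumed numbers, about two-thirds of the final balance comes from compounding rather than contributions, which is the whole point of automating it.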

Once you have equity, you can borrow to invest more.

If you get any bonus-type money like RSUs, then invest it as you go along, unless you need it to create more cushion for your mortgage.

badpun 2 years ago

The singularity will allow anyone with capital to employ a large number of virtual geniuses, working tirelessly to solve their problems. This should vastly accelerate GDP growth (one researcher estimates GDP doubling every couple of weeks, IIRC), but it will be the result of capital investment, not labor. That means most of the benefits will go to those who already had capital before the singularity. So you should be saving and investing to prepare for that moment.
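Taken literally, a fixed doubling period compounds absurdly fast. A quick sketch (the doubling period and horizon below are illustrative; the cited estimate is an unnamed researcher's rough figure):

```python
# Back-of-envelope: total growth implied by a constant doubling period.
# The specific numbers are illustrative, not from the cited estimate.

def growth_factor(doubling_period_weeks: float, horizon_weeks: float) -> float:
    """Growth multiple over a horizon, given a constant doubling period."""
    return 2.0 ** (horizon_weeks / doubling_period_weeks)

# Doubling every 2 weeks, sustained for a 52-week year: 2**26
print(f"{growth_factor(2, 52):,.0f}x")  # 67,108,864x
```

Sustained for even one year, that rate multiplies output roughly 67-million-fold, a reminder of how quickly exponential claims collide with physical constraints.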

al_borland 2 years ago

The premise here assumes living a lavish lifestyle is a good thing that all people want, or should want. I don't agree with that premise.

mikewarot 2 years ago

Take a good, long look at this list (which is always being updated) of all the wrong predictions for the end of the world. [1]

I used to take those things seriously in my youth. Back then we were heading rapidly into an ice age.

Don't go nuts spending your money, plan carefully for retirement, and if you're lucky, you'll get to spend those savings. But don't be so tight with it that you miss out on everything.

[1] https://en.wikipedia.org/wiki/List_of_dates_predicted_for_ap...

atleastoptimal 2 years ago

Because the ASI singleton superintelligence will consider money its god and require 10 million to get into virtual heaven, everyone else's consciousness goes to low render distance Minecraft backrooms for 12 trillion simulated years

runjake 2 years ago

I disagree with your basic premise that the singularity or end is near.

In the 1970s and 1980s, it was communism and nuclear war that were going to end the world. We were near certain of those eventualities. In the 1990s, it was environmental disaster. In 2020 and 2021, it was the pandemic. And now it seems we're back to environmental disaster again -- I can't even keep up with the oft-parroted line that humans will extinct themselves by 2100. It could happen, who knows, but it probably won't.

All of the above worries are legitimate challenges we should be concerned about. We will have countless challenges and black swan events. The problem with these predictions is that humans (well, sapiens, anyway) always seem to find a way to survive through them.

Does it mean we won't extinct ourselves? No. But, through the past tens of thousands of years, the odds have been in our favor.

And screaming that the sky is falling only serves to work against the cause, leaving your opponents and those on the fence untrusting and unconvinced when the prediction fails to arrive as described, at the allotted time.

RecycledEle 2 years ago

What if the singularity leads to those with money owning the land and raw materials, and locking in their control of society forever?

dave4420 2 years ago

The point of the singularity isn’t that money won’t matter after it hits. The point of the singularity is that we can’t predict anything about the world after it hits.

Maybe money will be useless.

But maybe money will still be useful.

We can’t know until it hits. Maybe not even then.

Plus there is the problem that we don’t know which century it will arrive in, if ever.

coretx 2 years ago

Why don't you tackle the problem one step at a time? You need to stay alive. Arable land with access to fresh water and a warm roof above your head are the minimum requirements for just that, and therefore will always hold some value.

meiraleal 2 years ago

Health becomes more important than money then. Post-scarcity and singularity won't cure diabetes or make people less sedentary.

ksherlock 2 years ago

You can either eat well or sleep well. (c.f. The Protestant Ethic and the Spirit of Capitalism).
