Sam Altman would like to remind you that humans use a lot of energy too (techcrunch.com)

Cancer uses a lot of energy too.
His analogy reveals a glaring lack of intelligence and empathy.
First, an intelligent person would think of the consequences of such a statement.
Second, an empathetic person would think of the consequences that the AI resource drain has on the rest of society.
86 comments: https://news.ycombinator.com/item?id=47112633
The optics are bad. Sam needs a good PR firm/coach. He also needs a Sheryl Sandberg in my opinion.
The message behind it is that yes, training a new model uses a lot of energy, but the benefits are enormous once it's trained. I think his "humans use energy too" analogy likely came from hearing messages about how training pollutes the earth. In his mind, training a new model is as useful as, or more useful than, 3,000 people on average. Even if he is right, he shouldn't have used this analogy given how sensitive people are right now about AI, energy, potential job loss, etc.
Having worked in San Francisco, I know how insensitive tech bros are to the outside world. Perfect example is the 2015 Airbnb ads: https://www.sfgate.com/bayarea/article/Airbnb-apologizes-for...
> In his mind, he thinks
If people have to analyse what "he probably meant" and how "he probably thinks" because what he actually said is completely stupid, that's not a good sign to me.
> Even if he is right
Well, he obviously isn't: his analogy is completely stupid. What you're trying to rationalise as "probably what he meant" can't exactly be right or wrong; it's a bit philosophical. If the goal is to be as productive (or "useful") as possible before we as a species completely collapse, then maybe training a new model is as productive as 3,000 people on average.
Now if the goal in life is for most living things to be better off in the long run, being productive is the opposite of what works. Again, we are living through a measurable mass extinction right now, one that is happening orders of magnitude faster than the famous one that ended the dinosaurs. We are failing at surviving as a species, and we are so good at failing that we are making most other species fail to survive too.
If "his philosophy" (again, the one we have to interpret, because he is incapable of articulating something that makes sense) is that "the goal of our species should be productivity and not survival", then I can confidently say I disagree. But even that is giving him a lot of credit based on this analogy.
He's thinking like an AI CEO. I don't think his goal is to eliminate 3,000 human births so he can use the energy to train a new model.
In his mind, he must be thinking: I can train a model that uses as much energy as 3,000 humans, so that it can then be used by billions of humans. He thinks it's a positive statement.
Again, the optics are bad. Read between the lines. I can see how most people are offended by his statement.
I don't know what he is thinking. I know what he says. And what he says shows that he has no clue about the energy problem.
> it can be used by billions of humans
Unless we have an energy problem, which is precisely what he does not seem to grasp. People who are concerned about energy consumption are concerned precisely because it is a problem. And if we are really generous with his answer and start replacing what he said with what we want to believe he said, then the conclusion is that he is very naive and doesn't understand the problem at all. And that's the generous case, where we completely ignore what he actually said.