Ask HN: Will AI make us unemployed?
I can’t do without the help of AI in my daily development work. I use ChatGPT, GitHub Copilot and various AI-driven design platforms. AI can help me improve my efficiency by at least 30%. For some HTML pages or pure algorithm modules, AI code accounts for 80%, and only a small amount of modification is required.
The workload that used to take two people now only requires one. If this continues, will AI make more people unemployed? Is it possible that AI will write all the code in the future?

I do embedded development. I've tried a ton of models, and all of them end up hallucinating and wasting my time with the very obscure and undocumented things I work with every day.
Maybe if you're in webdev land, AI will replace you. I don't know.
I have as much hope in AI as I had in website builders and no-code platforms.

Do you include good examples and documentation for context, since the models don't seem to include much training data directly related to what you're working on? I've had mixed to positive-ish results with that approach.

I mean, how am I supposed to give it a 10k-50k page TRM? I would much rather read the TRM and understand things myself than have it spit out a questionable answer that I still have to double-check to make sure it hasn't hallucinated.

Both Claude and GPT-4x excel in embedded development. While they aren't perfect, it's evident that if the current trend continues, they could potentially do my job within five years.

AI is useful even in embedded development, but it certainly won't replace you.

No. We are in a hype phase. VCs pump money into it because it is the current hype; before this it was NFTs and crypto. The "AI" phase (a better name for it is LLMs or GenAI) will also pass. VC money will run out. There is no real business model yet that can support the enormous amount of resources needed to train and run these models. Hardware is not fast enough yet; it could become sustainable once it runs locally on my own hardware. Until then, I see this as another hype we shouldn't pay much attention to.

Speaking of "yet": how does the rapid trajectory of increasingly efficient models like 4o mini fit with this take?

I sure as hell hope so. If you are a framework junkie who cannot write original logic or stand up a small original application, you are already largely irrelevant; your employer just hasn't determined how to replace you yet. Everybody else pays the price for that inefficiency. Software is littered with developers who cannot program. If there is some magic solution that makes these people suddenly irrelevant, the world is a better place. Consider the manual transcription jobs that instantly evaporated because of photocopying.

Who is "us"? All software engineers are different. AI reduces toil and handles only dumb work (and it often struggles even at that in the real world). So if you're doing dumb work (because of legacy, bureaucracy, poor management, or any other dumb reason), then yes, you're going to be obsolete sooner or later. The "good" news: a) AI needs another major jump or two to be a realistic, scalable tool for doing dumb work end to end, and those steps haven't been invented yet; b) bureaucracy and poor management are stronger than the strongest AI, so there will always be some vacancies, even for the dumbest work.

Marshall Brain explores this idea in a short story, Manna. It is scary, and prescient. https://en.wikipedia.org/wiki/Manna_(novel) Available free online from the author: https://marshallbrain.com/manna1

> I can’t do without the help of AI in my daily development work. I use ChatGPT, GitHub Copilot and various AI-driven design platforms. AI can help me improve my efficiency by at least 30%. For some HTML pages or pure algorithm modules, AI code accounts for 80%, and only a small amount of modification is required.
If you replace "AI" with "StackOverflow", would you still feel the same?

AI, unlike StackOverflow, won't call you names or accuse you of an XY problem.

Yes, this is a major plus point of LLMs: you can ask really stupid questions about things you know nothing about and get a starting point. I'm doing this with Bard all the time.

At some point, AI will become mighty. In certain areas it already is: predicting machinery failures, weather forecasting (Google has a weather model), and so on.
What you are talking about as "AI" is an LLM, and LLMs are not AI; they are just one part of a larger family of technologies and techniques, also known as machine learning. So please keep those separate. LLMs won't replace you in this round; they statistically compose text/code. But maybe in the next round. The best thing about this is that a human is still needed to do the reasoning instead of the LLM. When that can be done as well (e.g. with a combination of different ML techniques and LLMs), then we have to talk about what can be done about it. We can talk now, but every answer to the question "what will happen to working people when AI rises" is just a thought experiment, and in the end it leads to only two outcomes: you set it aside and keep working, or you stop working and let AI do it for you. During the transition, a lot of people will surely become unemployed. But at some point, the whole economy, and money itself, will stop making sense. Just imagine an AI producing the parts you need with a 3D printer at your signal. By the time we reach that point, society will have transformed so much that your question no longer matters. (I think Europe will start looking for solutions to the threat early, such as an AI tax to support the people who have lost their jobs. America, India, and Africa (as examples) are places where many people have to look after themselves, with no state to help. There will be big problems there, since no social or government safety net exists.)

> AI can help me improve my efficiency by at least 30%
How did you measure that? Did you measure the efficiency that IntelliSense, codegen, other tooling, etc. provided to you before ChatGPT/Copilot, and what is the delta?

Someday, yes. But for now, the rate of improvement of AI has slowed down. Every release claims to have beaten another benchmark, but there's been no breakthrough comparable to GPT-3.5.

Will AI make us unemployed? No. Will it take over most of our current work? Yes. Then we will figure out what we can do that AI can't. That's how tech has always evolved.

Bluntly, yes. There will be fewer jobs out there for software engineers, since AI can do much of that work cheaply and very fast. OpenAI talked about driving down the cost of intelligence, and that is now happening with every model being released. Right now, LLMs like GPT-4o and Claude have gotten to the intelligence of a mid-level engineer. The company I was at was so fascinated by this that they closed job openings for junior employees, laid them off, and are thinking about laying off some senior engineers after running a trial of AI tools like Cursor, Copilot, and others. I can only imagine the same thing is happening across other companies, and especially startups that cannot afford many senior engineers; they will just use AI and a single senior engineer. Expect more layoffs and fewer job openings, unfortunately.

Counterintuitively, decreasing the cost of development might actually drive up demand. A lot of potential software currently can't be developed cost-effectively, so it isn't. More startups with a single senior engineer (augmented by AI) become possible at a lower price point. We might actually end up with more work, and an altered kind of work.

> Right now, LLMs like GPT-4o and Claude have gotten to the intelligence of a mid-level engineer.
LLMs have no intelligence at all; they don't reason.

> LLMs have no intelligence at all; they don't reason.
So the reasoning benchmarks (1) for LLMs like Claude 3.5 are all categorically flawed, and the models will never get better? Yet their outputs are heuristically good enough to take jobs away from engineers by driving the cost of code generation down to almost zero. If you really believe these systems cannot reason at all, in any capacity, it would be regrettable for you to bet against these systems getting better, or against another breakthrough changing the picture with a new generation of LLMs.

Anthropic is an AI company; it's evident that they want their system to seem intelligent. That's what they're selling, after all.
> it would be regrettable for you to bet against these systems getting better, or against another breakthrough changing the picture with a new generation of LLMs.
Now you're putting words in my mouth. The results are impressive, for sure, although I don't think we have seen any major breakthroughs since the original release. Just small, incremental steps. I'm not convinced that will get us to reasoning capabilities, but I don't mind being proven wrong. Predicting the future is hard.

AI or not, has there ever been a 50-year span in human history where the majority of jobs didn't become redundant? Technology evolves. New jobs spring up to replace the ones that were lost. People adjust to the new reality, and our lives are better because of it.

> AI or not, has there ever been a 50-year span in human history where the majority of jobs didn't become redundant?
Yeah, many. For tens of thousands of years you were either a hunter or a gatherer. Even after agriculture showed up something like 12,000 years ago, the jobs of farmer, craftsman, soldier, etc. were pretty stable. Job churn really only begins with industrialization, and that process is only something like 200-400 years old, depending on where in the world you are.

Yes. Software dev has 10 good years left.

> I can’t do without the help of AI
Yikes. Maybe you shouldn't be in software engineering? Not a great look, professionally.

I can smell your fear. The no-code/low-code/offshoring/nearshoring phases we've been through and the number of hallucinations I've seen are the best job security I could ever ask for.

I see. You know how to look great, professionally.

Look? Do. The reason all of the above fails (and why AI-dependent developers will fail) is the quality of the work product.

Someday, but not too soon. I fucking love Claude 3.5, but I still catch it slipping. As Jensen Huang of Nvidia said, AI won't replace you, but you will be replaced by people who use AI. I think the former will eventually come, but the latter is already here.