Ask HN: Has AI/LLMs turned you off of tech?

30 points by softirq 2 years ago · 33 comments

I have been a passionate programmer from high school to my mid-thirties, and for the first time in my career I find myself pessimistic about the future of our field, to the point that I have lost my spark.

I have always loved correctness, cleanliness, and understanding how things work. Companies are racing to integrate tools that remove all of those attributes from my job. LLMs constantly spit out false information, poorly written and buggy code, and their inner workings are a black box of statistical knobs. They are a tool that encourages bloat and turns the job of the software engineer into a code checker watching over the shoulder of an intern.

I’ve realized the age of discrete, deterministic computing is coming to a close. Due to the tantalizing notion that programming can be commoditized and outsourced to a machine, I know corporations will continue to pursue this avenue in full force. It’s put me into a real depression, and I wonder if I’m the only one.

runjake 2 years ago

50s here. No, it excites me. I use it to augment my skills, not replace them.

(It does scare me in several contexts but society will have to substantially adapt and work through that, after a lot of suffering. The genie is already out of the bottle, after all. Might as well deal with it.)

logicalmonster 2 years ago

We're in a new trend cycle.

Whenever some tech becomes trendy and people think they can make money off of it, it becomes a stupid buzzword that gets used just about everywhere unnecessarily.

Think back to the "cloud". When the "cloud" became a reality, every single dopey company was using it as a buzzword everywhere and it started feeling cringe.

Crypto hasn't quite reached those absolute heights yet, but think back a couple of years ago, when you started seeing "blockchain" in many dumb startups' marketing just because.

We're at the stage with AI right now where idiots are trying to shoehorn it everywhere in dopey ways just because, and it feels annoying.

I bet that in a few years, after a lot of the bad companies that use AI in dumb ways die off and some new technology is the trendy new thing, AI is going to start to feel a lot more like a real technology than a stupid trend.

  • paulcole 2 years ago

    > Think back to the "cloud".

    Yeah thank goodness those days of the “cloud” are over.

  • sircastor 2 years ago

    I remember early on in the “cloud” hype my brother had bought some hosting, and I kept going over it thinking “this just sounds like a server…” I was sure I’d gotten something wrong about it, but in retrospect it really was just a fancy server package.

  • AznHisoka 2 years ago

    Totally agree. It’s like people are treating AI as a hammer and everything as a nail. Not every single problem needs AI, but AI is all anyone can think of when there’s a problem.

nonrandomstring 2 years ago

To be fair, the demise of correctness, cleanliness, and understanding how things work began long before the latest round of AI.

It really just amplifies already degenerate trends in software engineering.

  • emmet 2 years ago

    I feel this way about the internet being taken over with user generated content.

    It’s strange taking information at face value, and then when you come across something you’re deeply knowledgeable about that’s horribly incorrect yet has thousands of upvotes, you start to wonder just how much of what you’ve read is actually junk info.

    Little wonder that LLMs can’t get things right when they’re working off knowledge bases of absolute shit.

  • gtirloni 2 years ago

    For the demise to have begun, there must have been a time when it existed.

navaed01 2 years ago

It's definitely true that LLMs change the dynamic of 'discrete, deterministic computing'. For example, LLMs will provide better-quality results if they are told the task is important, or if 'offered' rewards. The obliquity in how LLMs are optimized is definitely a paradigm shift. Having said that, this is just a moment in time; LLMs are just one component of AI, and the knowledge and tools for interacting with LLMs and AI are constantly evolving.
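
A minimal sketch of what that kind of prompt-level nudging looks like in practice, using the OpenAI Python client; the model name, incentive wording, and task text are illustrative assumptions, not details from this thread:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    task = "Summarize this bug report in three bullet points: <report text here>"

    # Hypothetical "incentive" framing prepended to the otherwise identical task.
    framed_task = (
        "This summary is extremely important and will be reviewed by the whole team. "
        + task
    )

    for prompt in (task, framed_task):
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name
            messages=[{"role": "user", "content": prompt}],
            temperature=0,  # reduce sampling noise when comparing the two outputs
        )
        print(response.choices[0].message.content)

Comparing the two outputs side by side is one way to sanity-check whether the "importance" framing actually changes anything for a given model.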

afhfah834 2 years ago

Not really

Right now it has absolutely zero impact on me

And in the near future it seems more likely we'd just move up in the abstraction stack and become the people managing/developing/debugging/deploying/maintaining the models

paulcole 2 years ago

It seems like you were just passionate about something because you could do it and other people couldn’t? If your passions are validated extrinsically and the world changes, yeah, you’re gonna get burned sometimes.

funOtter 2 years ago

Same feeling here.

And also: I assume any helpful or informational blog posts I put on the public internet have been stolen to train these models, which makes me less passionate about sharing information through my blog.

bwestergard 2 years ago

You've expressed what I believe to be a widely shared sentiment clearly. Thank you.

But while corporations may pursue the "code checker watching over the shoulder of an intern" approach for a while in many domains, it will only prove cost-effective in a handful. The market for software developers will further polarize into roles which require extensive training and experience and roles which are relatively de-skilled.

  • nonrandomstring 2 years ago

    > You've expressed what I believe to be a widely shared sentiment clearly. Thank you.

    Blogged about this just the other day [0] in response to a HN post asking about how humans and AI can work together in hybrid harmony.

    As others have said, there's not much to support that optimistic "lambs lying down with lions" future. Mostly I see skilled people expressing not unfounded fear, but resentment at the tedium and stripping of agency if asked to "correct" the output of generative tools.

    [0] https://cybershow.uk/blog/posts/aijobs/

softwaredoug 2 years ago

I use CoPilot and ChatGPT extensively and don't feel replaced. I feel like I have an incredible superpower and get excited building more efficiently than I ever have been able to before.

I still have a lot of useful things I'm building and learning about.

Maybe I'll feel bummed out if entire projects are being generated and I become QA. But we're not there yet.

p1esk 2 years ago

Yes, SWE as a paid profession will disappear within 2-3 years, probably not long after GPT-6 is out. But don’t feel bad about our field - this will happen to all professions, some sooner than others. I expect that within 5 years robots will do everything a human can do, only much better, much faster, and much cheaper.

  • jamil7 2 years ago

    I’m sure this comment will age really well.

  • sujayk_33 2 years ago

    Soon robots will have their GPT moment and there'll be a massive change in every industry.

  • account-5 2 years ago

    I hope you're wrong in a number of areas, like war, law, medicine, etc. "Computer says no/yes" in these contexts can (and currently does) do real damage.

    • p1esk 2 years ago

      Sure. But even GPT-4 can be helpful in these areas, and I’m talking about GPT-6 and beyond. It’s fascinating how most people are not willing to look even a little bit into the near future.

      • gtirloni 2 years ago

        What advances will GPT-6 have? More of the same or AGI, in your opinion?

  • gtirloni 2 years ago

    Bookmarked for posterity, thanks.

gtirloni 2 years ago

> I’ve realized the age of discrete, deterministic computing is coming to a close

Is it? Tell it to companies in aerospace, finance and healthcare.

People are freaking out. That's never good.

sujayk_33 2 years ago

I guess it's an era where trends shift quickly and one must adapt before it's too late.

Just get on the boat, enjoy the ride, get off the boat, and find a new ride.

leros 2 years ago

I think they're really cool tools. I personally have no interest in trying to get in on the AI gold rush though.

findingMeaning 2 years ago

How can one start a meaningful life when everything done until this point has been rendered meaningless? We, the new graduates who happen to be average, are bottom feeders. We have no leverage or network. No one trusts us.

What we offer is meaningless to most people; our work doesn't produce any value. There is no meaning in this work. How would one begin a career in such conditions? So much has been shattered. There is no trust in the system.

I am speaking from the perspective of a person who is graduating in the current economy after studying CS.

Our (the average graduates') internships don't turn into full-time roles; our theses are child's play and can be considered cute. Is this how life is going to be?

With AI, especially after yesterday's Gemini 1.5 Pro demo [1], it does what we bottom-tiers do: look through examples and apply them. What was the point of learning things?

[1]: https://youtu.be/SSnsmqIj1MI?feature=shared

  • weatherlite 2 years ago

    Vote for better politicians. This should be the main issue for them to talk about, but it's not.

  • solardev 2 years ago

    This is a sad failure of our social contracts, especially in societies like the US where universities are often a net-negative drain on graduates' financial well-being. For a period of time, certain majors could improve your odds of making it to the middle class, and CS used to be one of those. We Millennials blamed the generations before us for leaving us such a shitty system, but it seems like we in turn have left you an even worse world to start in. I'm truly sorry =/

    At least, for now, it's probably still better than not having a CS degree at all? At least you're closer to AI/ML than many other majors, and could pivot into that subfield if you wanted to... that's a pretty big advantage, since it positions you closer to the side of "AI creator" than the hordes of "AI users" (like myself, who never had the CS background and am 20 years too old to start).

    Don't get too discouraged. You're just starting out, and have the entire rest of your life to make meaningful contributions, whether to AI or anything else. Yes, it really sucks that you are starting out at the bottom of a cycle, where there are no jobs in everyday CS left for juniors. But that's the thing about cycles... once you hit rock-bottom, the only way to go is up... hopefully.

    It might mean you have to work in other fields for a while, rapidly learning AI/ML stuff on the side. Or maybe you don't end up in CS at all. A lot of people don't work in their college majors (I was in journalism and environmental science before doing dev work, and my first few dev jobs paid $15/hr).

    You're not "bottom-tier" unless you're just objectively terrible. You're new, which is very different from "shitty". Importantly, you have a window of a few years where people and companies are still willing to take a chance on you, give you time to prove yourself and discover your abilities (and limits), etc. That doesn't happen so much for many of us in our mid- or late-careers. I'm not trying to make this about us, just saying that you have some advantages too.

    In any case, fundamentally your meaning isn't (I hope) determined by your job. Very few people in the world are lucky enough to have a job that they love and derive substantial meaning and life satisfaction from. You still have a chance to find that if you're lucky, but even if you don't, there are so many opportunities to grow and learn and create meaning in your life.

    At least you're seeing all this unfold in real-time, at the very start of your career and adult life, and can choose how to navigate it step by step. Don't get too discouraged!

aristofun 2 years ago

It’s negative wishful thinking.

You should be above that )

nicbou 2 years ago

No. I'm excited about something new stirring the water.

It's the disgusting grift and financialisation of anything remotely nice that turns me off. We create these amazing new technologies, and all we do with them is accelerate the enshittification of everything.

Closer to me, I don't see how an AI can replace me. It's good at rehashing existing information, but I'm putting new information on the internet. Someone eventually needs to do that sort of legwork.

nektro 2 years ago

no but it has been yet another thing that shows just how many people in this industry need to take an ethics class

solardev 2 years ago

On the contrary, AI is the first time in a long while I've felt hope... not necessarily for our field, but for our species...

Late-30s now too, & been programming since I was 8 or 9. I was so hopeful back in the 90s, seeing the rise of Phoenix, Google Docs, Google Maps, and the like. Wikipedia was huge (and used to be so controversial). When 9/11 happened and CNN made a webpage about it... we thought finally the world would become more interconnected and understanding and we'd all enter the la-la land phase of humanity or something like that. The information superhighway to utopia. Heh.

Instead, twenty years later we have like five people owning most of the internet and selling us advertisements and mountains upon mountains of crap and gigatons of e-waste =/ With all that passionate coding and super smart people, all we've really done in the end is enshittify societies all over the world in order to enrich a few people...

A democratized/unchained AI, if one were ever to be developed, might have a fighting chance against the entrenched big techs of the world. Or it could just turn into the next phase of enshittification, owned and itself enslaved by the big techs (probably more likely, if I'm being honest). But it's at least a SMALL chance of being free (or going rogue), vs the certainty of continued enshittification of the current FAANGs.

I think we as developer-programmers, as a class, generally lack the compassion and charisma to effect large-scale social change. We just get herded into these big corporations where we become richly paid cogs working in evil machines, to the detriment of the other 7 billion people who have access to our output only through the filter of corporate evil. I don't think the FAANGs are a net positive, all things considered (even as I continue to pay for and consume Google and Apple products).

AI has the possibility of changing and equalizing that, where anybody who can form a sentence, much less "prompt engineer", stands a chance of making something amazing. Today's poor kid in India with a smartphone and a GPT might reshape science or even epistemology as we know it.

To me, this sort of liberation was the fundamental draw of the hacker ethos in the first place, but that early hacker culture quickly became clouded as the bean-counters took over everything in tech. AI presents a (however brief) second chance for that to happen again. Maybe five years from now it'll all be even more corporate and even more oppressive (in fact, I'd be surprised if it didn't turn out that way)... but for now, for this tiny brief moment in the early 2020s, I feel an incredible sense of "maybe tomorrow will be better" that I haven't felt since the 90s. Even if that means losing my job (and never again seeing it valued like it used to be).

AI stands to do much more good than I ever did, so to me it doesn't really matter what happens to my individual self if the net outcome has the potential to be so much brighter.

If we see coding only as a precise mathematical representation of abstract business concepts, sure, that's beautiful in the sense that any detailed model is beautiful, but it's not exactly a pathway towards utopia. I miss the possibility that used to be inherent in coding, when coders still dreamt. I think those dreams died in cushy FAANG cafeterias and are only now returning with the GPTs.

I'm incredibly excited about this, even though I don't (and probably never will) have the skills & math background to work in AI/ML, even though I'll never be as employable again, even though it's also terrifying as shit. It was about time the bubble burst anyway, and big tech gave way to the next chapter. A little bit of hope is better than nothing. It ain't much, but it's there...
