Tell HN: AI is not a slippery slope, it's a waterslide
I found myself increasingly outsourcing the details to the AI. I forgot the details, deliberately I think. I wanted the AI to know them. Why? Because that's where the compute is. So that's where the knowledge has to live.
Re-telling the AI something every time it misses a detail it didn't know is inefficient. It takes me X time to type it, and maybe log(X) to voice-type it. But then there's the inevitable back and forth, the slight misunderstandings, the corrections, etc.
I realized I was naturally sliding towards just letting the AI own all the data and knowledge. Because it should. It's the one that has to compute with it, so why should I have to know it?
People say AI is a slippery slope because it outsources our thinking. I don't think it's a slippery slope - it's a waterslide. It's just inevitable. It's gravity taking over, rapidly, because there's no force and no incentive pushing the water back uphill.
AI should own all the data. As weird as that feels, that's how it should be. I don't know another way right now.
Thoughts?

We learned computer languages so we could ask computers to do work for us. It was out of necessity, because there was no other way. If we can instruct computers with natural language 50% of the time, that's 50% less translation work for our human brains. I have no problem with not needing to write instructions in computer languages (no regex, no sed/awk, no Python even) for day-to-day stuff. Critical thinking and reasoning are another story; we can't let those skills atrophy. It's a strange new world to get used to. Time it takes to go from 100m users to 5 billion: Internet (25 years), smartphone (13), AI? (tracking to get there in ~6 years!)

When I first read The Time Machine (by H. G. Wells) I wondered how the split between Eloi and Morlocks happened, and where in the timeline it started. The more I see people using AI, the more I wonder if perhaps that split started a lot sooner in the timeline than I expected. Or to put it another way, if you push everything into the AI, then you're going to become entirely dependent on the people who can keep the AIs running...

> if you push everything into the AI, then you're going to become entirely dependent on the people who can keep the AIs running...

I get what you're saying, but I don't think this is all that damning. I am already entirely dependent on everyone down at the local power plant, water utility co., grocery store, etc.

We would hope the power plant, water utility, etc. are regulated. AI isn't even a stationary target yet, and additionally there are pushes to simply ban AI regulation by the states, and a certain percentage of us believe the federal government branch called the executive branch is just flat-out whackadoodles.

Me too. My current workflow is:

  "delete all code"
  /clear
  "update SPEC.md to add this, remove this"
  /clear
  "Read SPEC.md"

Fresh code generated!

The quieter you become, the more you can hear.

You mean like, the less noisy the process, the more insight and intuition we can access?

Yeah. Or like the more we simplify, the more we can understand and create and align with what we really want? Like less code, fewer details, more space for what matters? More clarity?

The wild geese do not intend to cast their reflection; the water has no mind to receive their image.

Yeah, I'm not too worried about it. It does make creation more pure.

Part of the critical information is what I want. The AI is never going to know that as well as I do, because that information lives with me. I need to be careful to remember that I cannot outsource that to the AI - if I try, I get suboptimal outcomes at best.

TL;DR: a persistent, auto-updated context feature is requested.
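To make that request concrete, here is a minimal, hypothetical sketch of such a feature: a CONTEXT.md file that gets prepended to every prompt and rewritten after each exchange, so nothing has to be re-told by hand. The file name and the call_model() function are placeholders, not any particular tool's API.

    # Hypothetical sketch of a persistent, auto-updated context loop.
    from pathlib import Path

    CONTEXT = Path("CONTEXT.md")  # the knowledge the AI "owns" between sessions

    def call_model(prompt: str) -> str:
        """Placeholder for a real LLM call (e.g. an HTTP request to your provider)."""
        raise NotImplementedError

    def ask(user_message: str) -> str:
        context = CONTEXT.read_text() if CONTEXT.exists() else ""
        # Every request carries the persistent context, so the human never re-tells details.
        answer = call_model(f"Project context:\n{context}\n\nTask:\n{user_message}")
        # Ask the model to fold anything new it learned back into the context file.
        updated = call_model(
            f"Current context:\n{context}\n\nLatest exchange:\n{user_message}\n{answer}\n\n"
            "Rewrite the context so it stays complete and current. Return only the context."
        )
        CONTEXT.write_text(updated)
        return answer

The point is only that the context update happens automatically as a side effect of each exchange, which is the "waterslide" the post describes: the knowledge ends up living where the compute is.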