Ask HN: Looking for programmers who don't use and don't want to use AI
Hello everyone,
I would like to know if there are programmers, communities, conferences, companies, forums, etc., where people deliberately refuse to use any form of AI tools for software development.
I'm looking for like-minded people who are willing to keep developing things by hand and who don't want to use AI tools, at least in personal projects. If such communities already exist, I will be glad to join.
What I mean by not using AI tools:
1. It means that you don't use it at all, not even GitHub Copilot for code completions. Everything you create, you make by hand.
2. If you are using AI for something else aside from your personal projects, that's OK. For example, if you are required to use it at your current job, or like to discuss your cooking recipes with ChatGPT on weekends, that does not count as using AI for software development.
To clarify, I'm not against AI. This technology is marvelous, and it is totally fine if you are using it. I'm just looking for people who don't use it because they don't want to.

Someone call for me!? It's deeply unfashionable to believe in human potential, but, well, I do. I am one of the last working (human) hackers. See: https://paulgraham.com/hp.html.

My reasoning for not using AI (at all) is many-fold. First, I am a powerful learning model, the most powerful one I have access to. This model is so powerful that to reach its true potential I must continually feed it huge amounts of learning experiences and training data. I must seek out other unique perspectives and try to understand them. There is virtually no end to the amount of data this model in me can process and integrate using tools like "philosophy" and "logic".

Second, it seems to me that models have shown us that the human limitation is not our ability to think, but how fast we can type. The model-first people think that means we must find a way to offload all our ideas to agents, since agents are on the other side of the low-bandwidth interface that is the keyboard. I just think we need to arm the human with more expressive tools: an instrument, not an assistant.

Third, most people have stopped believing in the potential of great software. They seem to think that the tools we have now are the best that can ever be made, so by their reasoning there wouldn't be much point in getting good at building new things when instead you could learn to copy what already exists.

If you're interested in the tools we're making to mint a whole new generation of (human) hackers, I would urge you to come check out the BABLR Discord! It's where we work on our tools, talk philosophy, and boot anyone who tries to sell us their AI shit. https://discord.gg/NfMNyYN6cX Every line of code in the BABLR ecosystem was written by human hands too: https://github.com/orgs/bablr-lang/.
In five years of full-time OSS work I've built a streaming regex engine, the world's most powerful streaming parser, a language-agnostic AST/CST format, and a data serialization language that will eventually replace both XML and HTML: https://docs.bablr.org/guides/cstml

Are you against technologically modifying your body in order to become more productive? If not, then why not do that?

Well, I put a ski straight through my ulnar nerve once, and a surgeon had to splice nerve lining from my wrist into my elbow. I'm also currently wearing technological augmentations on my wrists. I call them "the gloves of coding", and they help me not succumb to carpal tunnel. And yeah, I spend all my time building developer tools, which are a technological means of augmenting your brain. I just happen to focus on technology that is deterministic.

What about directly connecting two "powerful" human learning models? Can you do this without disrupting your determinacy dynamics?

Wouldn't that just be two humans having a conversation, like we are now? Or beyond a conversation, humans build relationships -- I do that too, and those relationships influence and guide me. I cherish them.

Can your developer tools exponentially increase the subjective/internal bandwidth between the participants of a conversation? If yes, how?

By climbing the ladder of abstraction.

Are your developer tools capable of implementing telepathy? Of enabling the exact recording of sleep dreams?

Well, I have to say that wasn't one of my goals, though I do sometimes have ideas in my sleep and act on them.

There's a gigantic market. Believe me.

Perhaps there is, but I'm not a neuroscientist, and also that would be a hardware startup. I write code by hand in part because I need to do the work to know how to make good tools for doing the work.

I'm against AI. I've written two books on the topic so far. I also use AI, as I must if my software company is going to make money. There is no choice any more.
If I had a choice I wouldn't use AI for anything, ever. It's a blight on humanity. But it is a blight everyone has chosen, so we are stuck with it. I'm not sure I'll be fully anti-AI in perpetuity: the future is hard to predict. But it's certainly becoming clear to me that we need "-noai" variants of programming communities.

Virtually everyone on Mastodon is against generative AI.

Care to explain the distinguishing characteristic of the term 'virtually everyone' as compared to its synonyms?

What's wrong with using AI as a language-extrapolation tool specifically, for people who absolutely forego AI in personal projects? Is it that "language is God-like" and therefore cannot be extrapolated artificially?

I don't quite understand what "language-extrapolation" actually means. Personally, I don't use AI because I like to program myself. There are many other reasons, but this simple one dominates.

Programming with the use of AI can still be seen as programming, but at a different level of abstraction. Depending on your "job to be done", you will prefer one level of abstraction or another. An example from before AI: I've always hated JavaScript frameworks like Nest.js because they do too much magic under the hood. But for a simple CRUD application in an MVP, I might use one.

From a purely utilitarian position maybe you are right, but my point was about art and entertainment. Content-generation tools might be useful in certain situations, but it's not joyful when the robots do the job for you. Imagine that you purchase a video game, but then use an LLM to play the game for you. Another example is chess. Stockfish is more efficient than most chess players, but playing chess with a programming assistant (even a little bit) is no longer a sporting competition. I also agree that not everyone likes programming, and some see it just as a job to be done.

Now I see the art and entertainment point, and it makes sense!
But I'm now wondering why programming with AI doesn't feel like art (which is how it truly feels to me at this point). For me, I'd say the answer is related to feeling in control. Either:
- I don't know enough about coding with AI to feel I'm in control, or;
- the tools are not yet at the level I need to feel like I'm in control.

Language extrapolation is extrapolation applied to language. Example: you try to come up with the name for the most promising next tool (as in, a concept), as you personally judge how it should best be named. (IF YOU KNOW HOW TO TALK, you should have no problems with this.) You ask the AI to explain that term. If the AI misses significantly, you make your term more precise. Pro tip, inspired by my learning style. An example can be seen in my latest submission.

I like to program by hand myself too, but you can use AI for programming others.

Exactly this. Doing programming as a hobby is very different from doing it from a money-making perspective, but the two aren't incompatible. After all, people do many things that are literally "useless" for fun (and most already know there is a better way to do practically anything they are doing, same as with cooking); there is no reason this doesn't apply to programming. However, not using AI nowadays for anything business-related is counterintuitive.

What is the economic foundation of the entity/phenomenon "business"? Why do you think that it is counterintuitive?

Because I have myself 10-50x'd my productivity, and I have close to 20 years of experience in dev. I'm able to manage 50 repos at once (with heavy adversarial loops and many other automated procedures: e2e, regressions, and everything else you can think of), and I've spent about $50K on tokens lately (plus many subscriptions). I also have a grid of 8 monitors where windows are automatically switched to ask me questions non-stop. I'm a "bot", yes :p, but I've never before been able to even dream of managing projects this big while maintaining consistency and quality. I say it's counterintuitive in the sense that for a business, the "sheer" output now possible, not just in code...
but in marketing automation, documentation, testing, and prototyping brings you a competitive advantage that just can't be compared. I understand that you can still make money without AI, but if you can, you can also make so much more money with AI. It's not only about the money; it's also that you can solo-handle complex projects (assuming you have the mental bandwidth to manage 30 threads in parallel). I completely respect the OP's position for personal projects and hobbyist work, but I don't see how you can make more money without using AI. Quality of code (and even here, I'd argue that 20 rounds of adversarial review with 10 models is not really beatable) is rarely a success factor in business.

Well, I can ask you the same question. I don't see how you can keep making money using AI in the foreseeable future either. Even if it is possible today, once a critical mass of people masters this technology, the opportunity is gone. In the past there was a "human computer" profession[^1] where people earned money by calculating by hand. Eventually they were replaced by ordinary computers. But computers were rare and expensive; not everyone could afford one. So there was a period of time where big business had an opportunity. That opportunity was also gone once home computers became widespread.

Businesses may have an advantage in selling products made with AI. But I foresee that the value of these products will eventually degrade, because everyone will manage to do more or less the same at home. Ordinary software will no longer be a thing you can sell or advertise easily. You would have to try harder to produce something unordinary. And even if you do, it will be very easy for competitors to replicate it using AI. I think that, maybe not today but in a few years, the Internet will be full of generated content that most users won't trust and mostly won't be interested in. And there will be no novel ideas available to the public, because any know-how will be carefully hidden.
To conclude, it's not just about "code quality". If you are making something unusual by hand, something that AI cannot imagine, you have very serious reasons not to disclose it in this new reality.

I actually agree with you. The future is uncertain and even worrisome in many ways (and bleak :/ goodbye, Internet the way it was), but the 1-2-3 year window now is real, where practically anyone with motivation & skills can succeed.

Wait till you conceptualize "hand-made AI".