Why do we hack?
I think the "brutal pragmatism" of corpo SWEs has always been an opposing force to the hacking spirit. Catchphrases include, but are not limited to:
- don't reinvent the wheel
- right tool for the job
- use a library
- time to market
- customers don't care about X
- premature optimization
I hack because it's the expression of the exact freedom that made me interested in computers in the first place. Discouraging developing engineers from exploration & research and forming them into a product-oriented mold is something I really, really don't like.
Except for one, all of the above are good bits of wisdom: not reinventing wheels, using the right tool for the job, reusing libraries, getting to users quickly to collect feedback. All great engineering, whether you are building for the masses or just for yourself to scratch an itch.
But the one that cuts to the core of a deep problem in computing, and that sticks out to me, is:
> customers don't care about X
That's so broken it's not even wrong. It's a profoundly ignorant and arrogant misunderstanding of the world and of what we do as programmers. Developers have never known what customers want. Customers don't know what customers want. Because software is not a supply-demand business based on reasonable a-priori expressed preferences. It never has been.
For the most part, people accept what they get and define their understanding of technology and its possibilities accordingly. The limits of their horizons are arbitrary, the product of fumbling evolution, fads and fashions, science-fiction and fantasies, endless copying and reconfiguration of features, educational pre-requisites to access and understanding - and all happening within a rapidly changing world of social and hardware change.
The conceit of the programmer as a "master-chef", lovingly creating a dish to the exact delectation of a discerning customer is nonsense, and I have always taken pronouncements about "what customers want" to be naive, grandiose and out of touch.
Indeed, customers don't care about any of the details that go into the solutions we provide. Except the one that faces them: the UI (and related UX.)
Right? You could theoretically/evilly create a whole guide to "how to change a customer's mind on what they want."
It's worth reading some of BJ Fogg, oft painted as an "evil genius" singlehandedly responsible for digital surveillance capitalism. In reality he was/is a leader in understanding how, because UI/UX defines the reality and expectations of a user, it can itself influence their desires and expectations. He called that kind of social engineering through technological design "Captology".
> premature optimization
This reminds me of my first FAANG interview, about 2 decades ago. I was already accomplished, and I assumed it would be a normal software interview, which pretty much always went very well. But once in the interview, they wanted me to write some code on a whiteboard, which I'd only seen once before, and to talk as I'm doing it. OK, I'll try that. (I actually hadn't slept the night before, for unrelated reasons, and I'd unknowingly eaten something that makes me feel queasy, but I knew my way around whiteboard analysis and design, and I guess I could code on one.)
Right before then, I'd been working on an (introvert-powered) meticulous coding style, so I adapt that style to this unfamiliar interview format. I was time-slicing my attention between focusing on the code, and explaining my thinking process for some stranger's mental model (plus the social context awareness overhead).
While I'm coding on the whiteboard, I see that I have a variable that will end up getting set redundantly sometimes in a loop. As I quickly correct that, I verbalize what I'm doing.
And he interrupts me with an order: "Don't prematurely optimize." No clarification -- like that he thinks I'm barking up the wrong tree on algorithm approach (plausible), or that he doesn't know that meticulous coding can be super-effective and not a barrier to also seeing the bigger picture, or that he just knows "premature optimization" is a thing. I should've asked, but I was thrown by the interruption of my flow, and by the tone of voice.
In any case, that set the tone for the interview, and I didn't get an offer, somewhere that otherwise seemed made for me. I ended up going instead to a career path much less lucrative than 2 decades of FAANG, so that might've been an extremely expensive waste of whiteboard marker.
Today, people who use the term "premature optimization" had better be using it well, or I will secretly glare at them.
I feel for you. I'd probably be just as distracted as you were if this happened in the middle of an interview, but on the off chance I weren't, I'd probably talk the interviewer's ear off. What they seem to not understand is that this kind of correction is natural and perfectly normal when you're writing code. I do it all the time - I think of a small piece of code to write, I write it, and before I'm done I notice obvious improvements, which I make on the fly.
It's not "premature optimization", as it costs me nothing (on the contrary, not doing it will cost me my focus / peace of mind), improves performance of the code - or rather, avoids stupidity - and often improves readability, as the code is now closer to a "pure" description of what it's supposed to do.
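A minimal sketch of the kind of on-the-fly fix being described: a value that never changes inside a loop gets assigned on every iteration, so you hoist it out as you write. The function and variable names here are illustrative, not from the original anecdote.

```python
def total_discounted(prices, rate):
    """Naive version: the discount factor is recomputed redundantly."""
    total = 0.0
    for p in prices:
        factor = 1.0 - rate  # redundant: same value every iteration
        total += p * factor
    return total

def total_discounted_fixed(prices, rate):
    """Same result; the loop-invariant assignment is hoisted out."""
    factor = 1.0 - rate  # computed once, outside the loop
    total = 0.0
    for p in prices:
        total += p * factor
    return total
```

The fix costs nothing, changes no behavior, and arguably reads more clearly, since the invariant is now stated as an invariant.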
I've long held that a big part of why software today sucks so badly is that people don't do this, or likely can't do this. They write stupid code, and never bother to learn how to write the non-stupid version of the same code by default.
I suppose it might be more normal for C and C++ programmers to get in the habit of meticulously thinking through the code as we write it, since that's more of a necessity than in most other languages. (If we're not careful, we risk having to pay with painful debugging sessions, and expensive correctness/stability/security problems.)
But meticulous coding is still useful for nontrivial code in any language that we want to work well. And it's not that hard; it's just something we practice and get better at over time.
This is the same process that I follow. I think the best developers I've met are constantly going through this iterative approach as they code, instead of leaving it to come back to. Besides not having the time once a manager sees a working model, I think this makes you a better developer and allows you to focus on the problem more clearly with a "clean" initial solution.
Oh man! You've nailed it with this comment. I am that senior corpo SWE you describe, and I confess that I push a lot of these ideas. I have to ensure my teams stick to business critical paths but I do encourage things like...
- If you come across something you don't understand, don't just unblock yourself with stack overflow or asking me, try to grok the problem. Take time out to play with the idea.
- Try to understand what you're doing, never cargo cult program. Dig a level deeper than you need to. Foster your curiosity. Understand HTTP now? Great, now look at TLS and TCP.
- Focus on quality-of-life scripts and tools. Don't ask permission: if a bash script will save us time on repetitive task X, write it. Resist the constant pressure to deliver the ticket quickly at all costs.
- Do you really need that library? Try as hard as you can to avoid it, even if it adds a bit more work.
- For 10% time, free your mind to work on things and ideas that excite you. Don't let the perceived business objectives pressure you.
Etc...
The truth is, it's a balancing act but I do very much agree with the sentiment of this post. I struggle with the cognitive dissonance of having to think in these two ways at once, and push, as hard as I can justify, away from the 'ruthless pragmatism' route. As ever, it's all difficult trade-offs.
From a business point of view, if you have to sell it: the above produces better engineers, and the business ultimately benefits from higher quality and velocity. Obviously there is an upfront investment...
I agree with everything you said, and I share very similar views.
However, I have become a bit more defeated and jaded. I have honestly just come to the realization over time that if I want to scratch my hacking itch, then I will just have to do it on my own time with my own projects. My employer and coworkers do not care. Much like others' experiences, my employer actually frowns upon such ventures.
My employer wants things done as fast as possible, with whatever library, with us not reinventing the wheel, etc. That's what they pay me for, and I suppose that is what I am obligated to deliver.
I feel a lot of empathy for artists and musicians and the struggles and compromises they have made in order to "make it." I feel like we hackers must do the same.
"don't reinvent the wheel"
The single worst piece of advice that I consistently see repeated.
If your product is a bunch of APIs, you're a user, not a maker.
Then, where does being a "maker" even start? One will always end up using something someone else created, or discovered, whether it is a library, a language, an OS, hardware... Even if you build your own computer from scratch using materials you gathered, you still did not "create" the raw components, did you?
All of this is but a paraphrase of the famous quote from Carl Sagan: "If you wish to make an apple pie from scratch you must first invent the universe."
Ironically it takes some pragmatism to accept that this is an instinctual thing. If you saw that a baker was just reselling baked goods from a supermarket you would not call them a baker, but you would be happy with them purchasing flour.
Being a "maker" starts when you want to existentially vanish whatever is providing you the interface, whatever that is: from copper wires, to a hacky Chinese PLC program that works only in Windows XP SP1 with a manual translated to English by a word randomizer, to a fancy GraphQL API. And if your current interface is not hurting you, making you want to hurt it back, remove it from existence altogether, and start from scratch one layer deeper in the stack, well, you are not a maker. And that is fine; enjoy your interface.
I agree with this. The inevitable conclusion of my focus is a bare metal implementation.
Manufacturers value supply chain; so too should tech. If it's not your advantage, it's likely a disadvantage. Own your stack.
Proposal: Being a "maker" starts when one's addition —the stone one brings to the soup— is both novel and useful.
(the systems integrator may be doing something useful —certainly something lucrative!— while hardly novel; the shade-tree github'er probably has a directory full of explorations that are novel but not —at least not yet?— useful...)
IDK, I definitely don't consider novelty a requirement. Someone who makes a chair or a machine like https://www.youtube.com/watch?v=EgTnfTqycV0 is clearly a maker even if they're just making what many others have already made before; arguably the embodiment of the maker spirit is making something yourself even if someone else's solution is both available and more useful.
Is Frankenstein human? Configuration is not creation.
Whether Frankenstein is human or not is the question directly asked by Mary Shelley's book. As in our "making or not" problem, ironically, the answer is neither black nor white, as the monster is alive, yet made of dead flesh, and can be cruel, yet display human-like sensibility.
My point is that the line is blurry. "Configuring" a system with a large amount of parameters can be considered "creating something". A good example is how the knobs on a synthesizer can be tweaked to achieve a peculiar sound.
This is an interesting point. I believe the example provided might offer additional insight if rephrased.
Does the producer create music or simply refine it? Does the answer change when the recording artist is also the producer? What if the artist is not the producer or composer?
'Don't reinvent the wheel' is a big reason I've put off almost all side projects that I've wanted to do. Everything I've thought of, someone else has already written, and it's quite demoralising. I end up thinking, 'what's the point?'
If you cannot yourself come up with a truly original idea, consider this: there is always the opportunity and option to do a thing better than it's been done before. Choose a thing that excites you, but one where you see possibilities for improvement, or a unique way of doing it that may be better than those who have gone before. "Stand on the shoulders of giants" if that benefits you, but do it in the way that only you can do.
If you can't reinvent the wheel on company time, I would urge you to try and do it on your own time. You don't necessarily have to use the solution you come up with, but it helps to get a deeper understanding of the code beneath the libraries you use. Using this technique definitely made me a better developer when I was younger. I still like to take small portions of libraries I use and try to code them myself, just to get a better feeling for how things work (even if the code is vastly different from the library I use).
> on company time
Heh, I wish I had 'company time'. I am a final-year undergraduate with a mediocre GPA and no job offers. Ergo all my time is 'own time', and most of what I do is interview preparation by grinding Leetcode problems.
Keep in mind economic leverage is never a frontend feature. Gold is always buried.
And what's the problem with that? You're a technology integrator, solving problems and getting paid. That's what engineers do, and customers want robust and maintainable solutions more than creative ones.
Making things is great, but some problems already have solutions everyone mostly likes.
> premature optimization
I don't get it. Even when I'm working on a hobby project of no practical value with no deadlines, premature optimization is something I'd still like to avoid. I feel like PO harms code maintainability. It's far easier to optimize bottlenecks in well-written code than to refactor code complicated by PO, in my humble opinion.
Premature optimization should be avoided. It's in the name. But that wasn't really my point.
The problem is that PO is used as a catch-all, low-effort statement to dismiss performance questions and exploration. It's used by devs to antagonize curiosity.
>Premature optimization should be avoided. It's in the name.
Right, the word premature is what makes it a bad thing. What's missing is a complementary concept, let's call it early optimisation. Sometimes you can optimise a thing early on without loss of generality, increase of code complexity (sometimes, with a judicious comment at least, optimised code might even be simpler) or incurring technical debt. The problem is, as there is no awareness of this difference, early optimisation is often misconstrued as premature, even when it really isn't.
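A small sketch of the kind of "early optimisation" that point describes: choosing a better data structure up front, with no loss of generality and arguably simpler code. A hypothetical deduplication helper, shown both ways; the names are illustrative.

```python
def dedupe_slow(items):
    """Order-preserving dedupe with a list: membership check is O(n)."""
    seen = []
    out = []
    for x in items:
        if x not in seen:  # scans the whole list each time
            seen.append(x)
            out.append(x)
    return out

def dedupe(items):
    """Same behavior with a set: membership check is O(1) on average."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Picking the set from the start is not "premature": it adds no complexity, incurs no technical debt, and simply avoids building in quadratic behavior.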
I mostly do hobby projects like procedural generation, and I explore stuff I don't understand. As I gain knowledge I refactor and change the code to reflect the new understanding. This new understanding of the problem domain will most often result in performance gain, because the code is much more focused.
I especially love it when people with no knowledge of your code or the systems around it presume to know where the bottlenecks are and tell you to optimize a piece of code nowhere in the vicinity of the critical path. A huge part of optimization is realizing that you have limited time and you want to focus on optimizing the stuff that matters, which means if anyone ever gives you unsolicited advice, they're wrong. Until they've profiled your code, they have no idea where the key optimizations can be found.
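The "profile first" point above can be sketched with the standard library's `cProfile` and `pstats`: measure where time actually goes instead of guessing. The `workload` function here is a made-up stand-in for real code.

```python
import cProfile
import io
import pstats

def workload():
    # a deliberately hot loop so it shows up in the profile
    total = 0
    for i in range(100_000):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
result = workload()
profiler.disable()

# Render the profile sorted by cumulative time; the hottest
# functions (the actual bottlenecks) appear at the top.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(10)
report = buf.getvalue()
print(report)
```

Only after reading a report like this do you know which code is worth optimizing at all.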
I personally make a sharp distinction between my professional and personal work. Personal time is for "hacking", exploring, playing around with stuff. Professional/work time is for enriching the bossman as fast as possible - so using stuff other have made.
Of course. Very few people get paid to hack, never mind generally for things they enjoy.
100%
"Product" is the problem here, right? A product is a
1) finished, complete, shrink-wrapped thing
2) that someone else can't have unless they give you money for it.
Both of these characteristics strike me as part of an incredibly bad model for "making good software."
True. The other problem is that many companies have neither the capital, nor the risk-tolerance, nor the expertise in building and maintaining an R&D department. That's where the "curiosity itch" gets scratched, and where hacker and corporate interests are pretty much aligned.
That's funny, I don't experience it like that at all. Hacking and exploring is fun, and I did a lot of it when I was younger, but if I'm trying to get things done - which is what I'm doing 95% of the time, be it for work or for a personal project - I rather like to stick to those principles. And the fun comes out of delivering the product, completing the machine.
Maybe it's the result of too many years of freelancing.
Tiger got to hunt
bird got to fly;
Man got to sit and
wonder 'why, why, why?'
Tiger got to sleep,
bird got to land;
Man got to tell him-
self he understand.
Best way to confirm understanding is via play: if I do this, that should happen...

Because we can't not hack. I've always been interested in writing, and read a bunch of "how to become a writer" stuff. I read a piece with some great writer answering the Q "how do you know you're a writer?" Answer: because you can't not write. I'm perfectly OK with not writing. But not coding? That's another thing...
Not for money, not for recognition, not to please anyone, because I don't know how to not hack at anything that seems interesting (including things outside the domain of computing).
"Why We Fight"
Has anyone devised any sort of “eras” or “generations” in the history of hacking? What stage are we in now?
Because we don't have a girlfriend.
I know you are joking, but I think it's important that nobody feels excluded by assumptions on who "we" are, even in jokes.
"we" are not necessarily interested in having a girlfriend. We could be interested in having boyfriends. One or several. Or none at all. Or someone not specifically characterized by a gender. Or by another gender not listed here.
> I know you are joking
Genuinely curious, then: why not leave it be? It's just a joke, as you admitted, with no ill intentions, based on a stereotype that's mostly grounded in reality. STEM fields are mostly sausage fests. It's also quite a stretch framing it as "exclusionary"; it's a self-deprecating joke.
Jokes of this kind are funny because they reference an unstated truth that performer & audience know. They are one of the vehicles by which culture is created. If you think that a joke is communicating a mistruth (by which I mean something disputed, I do not mean a lie) or creating a negative culture - regardless of the intent of the author - it's okay to discuss that.
So in this case, I don't think there's any reason to believe there was an exclusionary intent, and I don't interpret GP as saying that or as calling GGP out, but there may be an exclusionary effect, and I think it's totally reasonable for GP to respectfully add this footnote to mitigate any unintended negative impact.
Some may say this is being a wet blanket or ruining the humor, but people always call you paranoid or a safety Sally or whatever when you try to mitigate unintended harm in any context; you learn to ignore it.
I used to be involved in a biology lab, and people loved to stand on these rickety tables that fell apart at the drop of a hat. So I would bring them a stepladder and politely insist that, if they were in my lab, they were going to use a stepladder. They called me a safety Sally every time. Doesn't mean I was going to let them smash their heads on the cement floor.
In your example you are replacing the labmates' dangerous solution with a much safer one. How would you replace their "dangerous" phrasing with "safer" phrasing while still retaining the slight?
Well, this is a more literal interpretation of the metaphor than I intended, so allow me to clarify. I don't mean to police people's phrasing and the connection I was drawing was about unintended harm and not safety and danger. In my mind, adding your thoughts to a thread is completely different from editing someone else's comment. Telling someone they should've made a joke a certain way seems a bit much.
I tried reframing your question to, "how would you have made this joke, preserving the slight", but the joke isn't to my taste personally, which is only a problem if you ask me to rewrite it - I'm not expressing this as a criticism of the joke, I'm just the wrong person to ask, because when I try to imagine how I would've made the joke, the answer is I wouldn't because it isn't my sense of humor.
I don't really have a problem with the way events unfolded with joke being made & getting a footnote. That honestly seems fine to me, I don't think it requires tinkering.
"George M. Jones"? Sure that isn't meant to be Stephen, aka smj of SDF and LCM fame?