The Value of Things – journal.stuffwithstuff.com


One of the reasons I write is to help me organize my own mind. I have a compulsive need to figure things out, and I’ll lie awake at night shuffling sentences around in my head until it all hangs together. Then I just have to try to get it all down in Markdown before it dissolves back into chaos.

Like a lot of people these days, I am losing a lot of sleep over LLMs and generative AI. I mean that literally. I’ve had nights where I tossed and turned worrying about whether AI is going to destroy a career I love or wreck modern society until the sun comes up.

I keep hoping I’ll sort it out in my head enough to reach some inner peace. I’m not saying I will literally figure it out, like, for the world. But I keep hoping I can at least figure out my own relationship with AI.

I’m still not there, but I was able to pull one thread out of the tangled discourse around AI and get it to make sense in my head. I hope it will help you too. Or maybe this is all nonsense and you’ll point out the myriad ways in which I’m wrong. That’s a possibility, but I’m trying to be more courageous in my writing, so here I am.

Anyway. Generative AI is about using machine learning to produce digital things. For AI to be a net good for the world, at the very least, the things it produces must have value. But what does it mean for a thing to have “value”?

Utility

The obvious answer is that a thing has value if it’s useful. If it solves some material problem in the world or at least creates some joy, that’s value. In the UX world, they call this “utility”: the thing a thing actually does.

An apple has utility. You can eat it and it will give you some calories, some hydration, a little fiber. If it’s not an oxymoronically named Red Delicious, it might even bring you some delight. Material objects have a natural sort of utility from their physicality, but digital objects can have utility too.

A while back, I took an interest in audio programming. I read a lot of articles online to learn some of the math and algorithms behind digital signal processing. That information is useful. Before I read those articles, I couldn’t program an FM synth. Now I can. That utility is just as real whether I got it from reading a classic article written by Julius O. Smith III, or from asking ChatGPT.
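For a sense of the kind of thing those articles unlock: a basic two-operator FM voice is just a carrier sine wave whose phase gets pushed around by a modulator sine, Chowning-style. Here’s a minimal sketch in Python (the function name and default parameters are mine, purely illustrative):

```python
import math

def fm_tone(duration=1.0, sample_rate=44100,
            carrier_hz=440.0, modulator_hz=220.0, mod_index=2.0):
    """Render a two-operator FM tone as a list of floats in [-1, 1].

    The carrier's phase is offset by a modulator sine scaled by
    mod_index; raising mod_index brightens the timbre by adding
    sideband energy.
    """
    samples = []
    for n in range(int(duration * sample_rate)):
        t = n / sample_rate
        mod = mod_index * math.sin(2 * math.pi * modulator_hz * t)
        samples.append(math.sin(2 * math.pi * carrier_hz * t + mod))
    return samples
```

A real synth would add envelopes and more operators, but this is the core idea those DSP articles teach.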

In both the physical and digital worlds, utility really does matter. One possible anthropological definition of “technology” is anything that enables humans to more efficiently generate utility. It’s central to human progress.

The reason I can go to a store and buy a bag of apples is because thousands of years of agriculture technology development means humans can very efficiently produce and transport food. And, you know, we need food to live. Food is cool.

My career as a programming language designer is only possible because of all of the material I found online to learn how compilers and languages work. Hobbies like knitting, making music, and cooking are deeply important to me. They’re possible in large part because of what I learned through YouTube videos. Those articles and videos were truly useful to me.

Generative AI can help improve this. Here’s a concrete example: I got curious about how AI is affecting software jobs outside of my weird tech company bubble so I started poking around job listing sites. I stumbled onto a software engineer position working for the Washington Department of Ecology. The listing says:

We’re looking for two journey-level developers who care about clean architecture, thrive in Agile teams, and see modern tooling—including AI-assisted development—as a way to work smarter, not riskier. You’ll collaborate with product, architecture, security, and platform engineering partners to design secure, accessible, scalable applications that replace decades of legacy complexity.

Use AI-Assisted Tools Responsibly and Effectively

  • Leverage AI tools for boilerplate, test generation, and safe refactoring.
  • Validate AI-generated code for accuracy, security, and maintainability — no blind trust, no shortcuts.
  • Share best practices with teammates adopting new AI workflows.

It’s hard for me to imagine a better morally justified use of AI for software development. This is a government office whose charter is to “protect and sustain healthy land, air, water, and climate in harmony with a strong economy”. They are literally working to make the world better for nature and humans.

But as a government department, they are also really limited in their resources. If AI can make those two software engineers more productive, it can take the same tax dollars as input and produce a healthier world as output. That’s hard to argue against.

Meaning

When people get excited about AI and productivity, I think utility is what they have in mind. But utility is not the only kind of value a thing can have.

A couple of years ago, I hand-knitted a scarf for my mother-in-law. The scarf does have utility. It’s a rectangle of fabric that keeps her neck warm. The Olympic Peninsula is a lovely corner of the country, but it’s often a chilly one. A scarf made out of wet tissue paper would not have the same utility.

I could have just gone to the store and bought her a scarf. Were I the kind of robotic rational actor who thinks only about maximizing their utility function when not sipping pour-overs in some culty rationalist commune outside of SF, then I would know that the optimal strategy is to use my software engineer skills to maximally turn my labor into cash and then use that cash to buy a scarf.

The scarf I hand-made took dozens of hours to make. At even a modest West Coast software engineer salary, that number of hours would buy an extremely nice scarf. Maybe not vicuña, but at least cashmere. Had I dropped a few hundred bucks on a scarf, would it have more value? Realistically, it would be a better scarf. Softer material, finer stitches. Certainly more fashionable.

You and I know the answer is “no”. As any parent who lovingly clips their kid’s objectively terrible art to the fridge knows, the value of a made thing is not entirely based on its merits. That scarf has value not because it’s a great scarf but because I chose to spend an irreplaceable fraction of my life making it. It’s a symbol of how much I care about her.

In short, it has meaning. Meaning doesn’t make an object more useful. It’s not even transferable. If my mother-in-law gave the scarf to you, its meaning would evaporate.

But I bet that if you think about the objects that have the most value to you, your S-tier will be populated by things that excel mostly in meaning, not utility. The things that you would grab before running out of a burning house are the things with sentimental, not functional value.

Source of meaning

You kind of know the answer intuitively, but I’m trying to be clear in my thinking, so when I talk about objects having meaning… where does that meaning come from? Why do some objects have more meaning than others?

I’ve spent hundreds of hours knitting over the past couple of years, mostly making things for friends and family. I’ve thought about this question a lot. I suspect that it comes down to spending time in service of another.

Humans are sentient actors with agency and entire cognitive universes in our heads. But we are also animals made of meat and bones, and that flesh doesn’t last forever. Every one of us has a finite amount of things we will ever do. “Spend time” is one of those stock phrases that has been sucked dry of semantic flavor over the years, but there is a profound metaphor there. When you choose to use some of your time making a thing for someone, you don’t get that time back. It is spent.

Choosing to spend a fraction of our most precious resource for someone else is the strongest signal we can send that that person matters to us.

I’m in my late 40s. I hope I still have a lot of living left to do, but the odds are very good that I’m past the mid-point of my lifespan. An increasing fraction of conversations with my friends revolve around our various medical tribulations. I can feel the finiteness of life in my bones, quite literally. When I think about what I want to do with my remaining allotment of life, the answer that resonates is taking care of and making things for the people I love.

Generating meaning

Generative AI, when wielded deftly, can be an amazing tool for creating things with utility faster and more easily than you ever could before. But it can’t generate meaning. The giant matrix of floating point numbers in a rack of GPUs in some data center does not love you.

Another story: When my brother and I were growing up, we were really into movies. We made short videos (hilariously bad), learned how to do special effects make-up (actually tolerably good), and all sorts of stuff like that. We dreamed about growing up and becoming another pair of Hollywood brothers like the Zuckers or Coens.

Many years later, as a birthday present, I wrote my brother a screenplay for a short horror film about a mythological siren. I toiled on it every night after the kids went to bed for weeks. It’s one of my favorite gifts.

I don’t know if we’ll ever get a chance to shoot it. We live on opposite sides of the country and he can’t handle the gloom of Seattle any more than I can handle the politics of the South. It’s likely this screenplay has zero utility. But it still has a ton of meaning because I sweated every single word in that stack of 12-point Courier pages.

Today, with the help of ChatGPT, I could probably put together a feature-length screenplay in a tenth of the time. It might even be an objectively better screenplay for a better movie. But because I made the screenplay in a tenth of the time thanks to ChatGPT’s help, it would hold only a tenth of the meaning for my brother. If my hypothesis that meaning comes from time sacrifice is true, then by making us more productive, AI eliminates meaning.

Both kinds of value

Most objects, digital or material, have a mixture of both kinds of value. My favorite gifts to give or receive are those rare ones that are both thoughtful and useful.

An object carries with it the weight of time that was put into making it, and has a certain heft from what you can do with it. I think of AI as a tool that can transmute the former kind of value into the latter. It drains some of the meaning out, but in return lets you make more things with utility for the same effort.

There is nothing unique to AI about this. Any tool that increases efficiency has the same property. They make little knitting machines that produce rows of stitches as easily as turning a crank. I could have made a scarf for my mother-in-law with one of those in a fraction of the time.

Efficiency—the multiplier that determines how much effort is required to make an object—works sort of like a slider that lets us choose the ratio of utility and meaning the resulting thing has. Generative AI, assuming the claims are true, has the capability to radically move that slider. It can produce huge volumes of stuff with real utility. But that exact same ease implies that the results are all devoid of a sense of personal meaning.

Two tangents

I don’t want to take the idea that efficiency is a slider between meaningful-but-scarce and meaningless-plenty too far. It’s a rough model. There are at least two problems with it:

First, it suggests that you can make an object infinitely meaningful by just making the process arbitrarily hard. I could have knitted that scarf using two leftover bamboo take-out chopsticks. In the dark. One-handed. That would certainly have required a greater sacrifice of time (and sanity) on my part. But there’s obviously a point of diminishing returns: people value the effort we put into making things, but less so when we elect to be masochists.

Also, when making things, we often derive personal joy from the process itself. Yes, I did sacrifice time to knit a scarf but… I like knitting. It wasn’t an entirely miserable toil or something. I got to play with yarn.

The high level point is just that the more we automate the process of making a thing, the less of ourselves we put into it. And an object with less of ourselves in it is often valued less by the person who receives it. That’s all I’m saying.

What kind of value do you want?

The tools we use to make things, and the productivity they grant, give us a lever to control how much we prioritize getting a useful object out fast versus getting an object suffused with meaning. That suggests a way to decide when we should and shouldn’t use AI.

If you just need a thing that does a thing, then its utility is what matters. I want the Washington Department of Ecology to use my very limited tax dollars as effectively as they can to keep the Pacific Northwest the enchanting natural marvel that it is. If they can do that better using AI, then by all means they should do so.

If you want to make a thing that has some emotional resonance or some connection to other people—whether that be a loved one, an audience, or humanity as a whole—then you don’t want to suck out any of its meaning.

I listen to a lot of electronic music while I work. In some sense, it is “utility music”. I just want a vibe to help me tune out the world and focus. I can’t even handle lyrics while I’m programming or writing.

I have to admit that the genres I listen to are the ones most amenable to being automatically generated by machines. I literally do listen to generative ambient, which has been a little corner of the music world since well before “generative” glommed onto the word “AI”.

Even so, I still find my listening experience more gratifying when the music was made by someone who actually gave a shit about it. Even though the relationship between artist and listener is vague and indirect, that connection matters to me. I know that when I make music, the thing that helps me push through all of the many frustrations of producing is the hope—however tentative—that at the end of it all, one day someone might put on a track of mine and find their day improved by it being the temporary soundtrack to their life.

Those moments of connection between artist and audience are what I live for as a creative person. For those, I can’t see generative AI as anything but harmful.

Sometimes yes and sometimes no

Thinking about the things I make and consume and the kinds of value they contain helps me sort out some of my extremely conflicted feelings around AI.

I want to emphasize that I’m only looking at this from the perspective of a single person’s individual use of AI. The global effects of AI are just as important, if not more so. But I haven’t sorted out my feelings around that enough to have any idea what to say here. Maybe I never will.

But I can set that aside and imagine a world where I can run a generative AI completely locally on my machine. A world where I was the only person on Earth who had an AI. In that timeline, there would be almost no externalities to consider. Even then, I still want to be mindful of the consequences of using AI to help me make.

In my own uses, I aim to focus on the places where I think AI can improve the efficiency of making utilitarian goods. And when I’m making something where the human connection and meaning are important, I will try to put as much of myself and as little of the machine into it as I can.