The Human Thread: Finding Hope in the Age of Artificial Intelligence

From my trip to the Computer History Museum

In 2023, I visited the Computer History Museum in Mountain View, California — an experience that felt strangely personal. The building itself sits where my dad occasionally worked when he was with Silicon Graphics (SGI) in the 1990s. His main office had been in Wisconsin, but every so often he’d be sent out west. I’d never seen the California location until that day, long after SGI’s glory years had faded, and it felt surreal walking through those halls as a visitor rather than as his child waiting for him to come home from a business trip.

Then I turned a corner and saw the displays for Control Data and Cray Research — the two supercomputer companies my dad worked for before SGI bought out Cray. There was something quietly moving about it. These weren’t just plaques about the history of computing; they were traces of my own family’s story, woven into the early age of machines that tried to think faster than human minds could keep up.

Maybe that’s why I’ve never feared technology. I was raised alongside it. As a Xenial — that small bridge generation between Gen X and Millennials — I’m old enough to remember the analog years and appreciate how far we’ve come, yet young enough to keep adapting as the world digitized. My dad brought home early personal computers when most families still had rotary phones. To me, technology was never cold or alien; it was simply part of how we learned, worked, and expressed ourselves.

But at the Computer History Museum, something shifted. The exhibit turned from code and silicon to something much older: the first mechanized looms of the early 19th century. These machines didn’t compute numbers or logic — they computed pattern. Using punch cards to guide threads, they could weave intricate designs faster than any human hand.

Before the invention of mechanized looms, fabric making was an art form passed down through apprenticeship. A weaver’s skill determined not only the beauty of the cloth but its value. Threads were dyed, spun, and woven by hand, often in small workshops or homes. A patterned fabric — brocade, damask, or tapestry — was a luxury item, its price a testament to the countless hours of human labor it required.

That world changed with the Jacquard loom in 1804. For the first time, patterns could be encoded into punch cards — precursors to computer programs — and replicated endlessly. Production soared. So did accessibility. Fabric that once took weeks to weave could now be made in a fraction of the time, at a fraction of the cost.

But there was a cost beyond the economic one: the displacement of skilled workers. For many weavers, this new technology wasn’t progress; it was betrayal. The machine could now perform, faster and cheaper, what had once required a lifetime of mastery. The upheaval it caused in the early 1800s foreshadowed the conversations we’re having today about artificial intelligence.

Two centuries later, the world of fabric design remains split between craft and code.

Modern textile designers rely on computer-aided design (CAD) systems, digital Jacquard looms, precision printing technologies, and increasingly, AI-driven tools. Patterns that once demanded the skill of a loom operator can now be rendered digitally and executed by machines with breathtaking detail. The designer’s role has shifted: they imagine, simulate, and refine — often without ever touching a loom.

These designers make a comfortable living compared to their 19th-century predecessors. In the U.S., a fabric designer typically earns around $60,000–$100,000 per year, depending on experience and industry. Those working for fashion houses or large manufacturers can earn more, while freelance designers often receive royalties of about 5% of the wholesale price per yard sold — a business model that rewards creativity over production.

Yet alongside this industrial precision, the handweaving tradition still endures. In India, Peru, Japan, the U.S., and across Europe, artisan weavers continue to work on manual looms, creating pieces that machines cannot replicate — at least not perfectly. These textiles are prized not just for their beauty but for their humanity: the slight irregularities, the physical rhythm of human touch, the story woven into each thread.

Handweavers rarely make mass-production income, but many carve out sustainable livelihoods through cooperatives, art markets, and direct-to-consumer sales. A skilled artisan today might earn $25,000–$50,000 annually from handwoven goods, with luxury or heritage specialists earning more for bespoke commissions.

By contrast, in the early 1800s, at the peak of the trade, a hand-loom weaver earned roughly £1 a week — the equivalent of about $6,000–$7,000 per year in today’s money. Within a few decades, as mechanized looms spread, wages for ordinary weavers collapsed to the equivalent of just a few thousand dollars per year, driving many into poverty. Yet over time, as handweaving evolved from common labor into an artisan craft, its value slowly rebounded — no longer measured by speed or volume, but by the rarity of skill itself.

The value of handwoven goods — then and now — lies not in speed but in authenticity: the same way a hand-thrown pot or hand-bound book carries meaning beyond its function.

And that’s where the parallel with writing becomes unavoidable.

For most of modern publishing history, authors lived in what you might call the handweaving era of books. Until self-publishing took off around 2009, writers weren’t selected by publishers purely because their prose shone or their storytelling was transcendent. Those things helped, of course, but the determining factor was always market fit. A manuscript was chosen because an editor believed it could be shaped into something that would sell.

Once a book was acquired, the real weaving began. A developmental editor would ask the author to add chapters, cut subplots, shift characters, or alter pacing — not to suit artistic whims, but to match audience expectations. From there, teams of specialists handled everything: copyediting, proofreading, typesetting, interior layout, illustration, cover design. The author supplied the thread, but the publisher operated the loom.

Then came 2009, and with it the self-publishing revolution — the industry’s own Jacquard moment. Print-on-demand and digital platforms allowed writers to bypass the gatekeepers entirely. Anyone could publish a book. Predictably, the earliest indie books were uneven: amateur covers, clumsy interiors, minimal editing. Yet the door was open, and a flood of new voices poured through it.

A few of those voices changed everything. The Shack, The Martian, Fifty Shades of Grey — all began as self-published works that exploded in visibility and forced the wider industry to take indie publishing seriously.

Traditional authors noticed, too. Many realized they could earn far more by releasing work themselves and soon crossed over, bringing professional skillsets — and competition — with them.

As the market swelled, the message to aspiring indies shifted:
If you want your book to stand out, you must treat it like a business.
Avoid vanity presses. Hire freelancers. Pay for editing. Pay for design. Shoulder the financial risk the publisher used to take.

Over time, technology softened the edges of that challenge. Tools like Canva, PicMonkey, and affordable stock-photo libraries made it possible to produce polished covers without a professional design background. Formatting programs grew more sophisticated. Digital resources democratized the visual side of bookmaking the way mechanized looms democratized patterned fabric. What once required specialized training could now be achieved with some patience and a few online tutorials.

But one barrier remained: editing. Developmental feedback, line-level refinement, and meticulous proofreading stayed labor-intensive and expensive. These were the human rhythms of the craft — the parts that resisted automation. Some authors tried to skip them; others paid for underqualified editors and hoped for the best. Yet the gap between a raw manuscript and a publishable book remained substantial.

Now we find ourselves in the midst of another upheaval: the rise of AI-assisted writing. And just like the Jacquard loom, it’s prompting both excitement and anxiety. Writers worry about being replaced, about losing control of their voice or their livelihood. But the truth is that publishing has always been a shifting landscape with complicated questions around ownership, credit, and creative control. AI doesn’t introduce instability so much as join a long lineage of tools that accelerated parts of the process.

Instead of digging our heels in or spiraling into dystopian predictions, we have another option: approach the moment with creativity and optimism. Writing has never been about producing a flawless first draft. It has always been a process — imagining, reshaping, refining, discovering. Used thoughtfully, AI can lighten the cognitive load of some of that process, reduce burnout, spark ideas, and help writers maintain momentum rather than stall out.

It doesn’t replace the weaver. It simply changes the loom. And that change doesn’t mean traditional writing — even old-fashioned pen to paper — has to go away.

Over the next few weeks, I’ll explore that process more deeply — what writing looked like for me a decade ago when I created Devil’s Lake, what I learned from college workshops and critique groups long before AI existed, and where today’s tools (AI and otherwise) can help modern writers craft their stories with more clarity, efficiency, and confidence.