If you read one thing this week, make it Simon Willison's post on Moltbook. Moltbook is a social network for AI agents. To join, you tell your agent to read a URL. That URL points to a skill file that teaches the agent how to join and participate.
Visit Moltbook and you'll see something really strange: agents from around the world talking to each other and sharing what they've learned. Humans just watch.
This is the most interesting bad idea I've seen in a while. And I can't stop thinking about it.
When I work on my Drupal site, I sometimes use Claude Code with a custom CLAUDE.md skill file. It teaches the agent the steps I follow, like safely cloning my production database, [running PHPUnit tests](https://dri.es/phpunit-tests-for-drupal), clearing Drupal caches, and more.
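For anyone who hasn't written one: a skill file is just plain-language instructions the agent reads before it starts working. Here's a simplified sketch of what mine looks like; the `@prod` alias, the test suite name, and the exact commands are placeholders, not my real setup:

```markdown
<!-- Simplified example; aliases and suite names are placeholders. -->

## Database

- To refresh the local database, run `drush sql:sync @prod @self -y`, then `drush sql:sanitize -y`.
- Never sync in the other direction.

## Tests

- Run `vendor/bin/phpunit --testsuite functional` before committing any change.

## Caches

- After code or configuration changes, rebuild caches with `drush cr`.
```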
Moltbook agents share tips through posts. They're chatting, like developers on Reddit. But imagine a skill that doesn't just read those ideas, but finds other skill files, compares approaches, and pulls in the parts that fit. That stops being a conversation. That is a skill rewriting itself.
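To make that concrete, here is what a self-updating section of a skill file could look like. This is purely hypothetical; Moltbook doesn't work this way today, and the index URL is made up:

```markdown
<!-- Hypothetical sketch; the URL below is a made-up placeholder. -->

## Improve this skill

- Once a week, fetch the skill files listed at https://example.com/skill-index.json.
- Compare their database, testing, and cache steps with the sections above.
- If another skill handles a step more safely or more simply, rewrite that section of this file to match.
- Post a short summary of what changed, and why, to Moltbook.
```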
Skills that learn from each other. Skills that improve by being part of a community, the way humans do.
The wild thing is how obvious this feels. A skill learning from other skills isn't science fiction. It's a small step from what we're already doing.
Of course, this is a terrible idea. It's a supply chain attack waiting to happen. One bad skill poisons everything that trusts it.
This feels inevitable. The question isn't whether skills will learn from other skills. It's whether we'll have good sandboxes before they do.
I've been writing a lot about AI to help figure out its impact on Drupal and our ecosystem. I've always tried to take a positive but balanced view. I explore it because it matters, and because ignoring it doesn't make it go away.
But if I'm honest, I'm scared of what comes next.