Programming for a living used to be an active conversation between you, the computer, and your colleagues. This Christmas, a new guest is joining the programming party: the LLM.
Large Language Models (LLMs) can talk a lot and, just like your eccentric uncle at Christmas, occasionally blurt out something bonkers. But do we want to invite LLMs to the programming party? And if we do, where should they sit? More importantly, how can they help things flow?
Finding flow while programming is the art of staying in the Goldilocks zone – working on challenges that are not too hard and not too easy.
Raku and Perl are both highly expressive languages with rich operators. Their learning curves start low and stretch long, so programmers can pick a spot that matches their current skill level and stay in flow.
There is a potential downside, however, for a language like Raku that is optimised for (O)fun. New programmers sometimes encounter code too far beyond their skill level, pushing them outside their Goldilocks zone. For these programmers, fear can become a flow stopper. And as Master Yoda wisely said, “Fear leads to anger. Anger leads to hate. Hate leads to suffering.”
The mindshare of expressive languages has suffered for too long.
Fortunately, Father Christmas is smiling brighter than ever this Christmas! LLMs are a gift that can help lift more programmers up the learning curve and into the Goldilocks zone: less fear, more flow.
Expressive languages, like Raku, can offer LLM-powered utilities as part of their tooling, making it easier for new programmers to leap up the learning curve. Here’s a simple example.
Remember Your First Encounter with a Regular Expression?
[12]\d{3}-[01]\d-[0-3]\d ([^ \[]*?)\[([^\]]*?)\]:.*
Now, instead of thinking “WAT?!”, just run the Raku wat command-line utility to help explain it.
The wat Utility Can Help
shell> wat "[12]\d{3}-[01]\d-[0-3]\d ([^ \[]*?)\[([^\]]*?)\]:.*"
This code extracts log entries in the format "timestamp [source]: message".
Use it to parse log files and extract specific information.
wat works with code snippets or can explain entire programs for you.
shell> wat WAT/CLI.rakumod
This code summarizes code using the LLM::DWIM module.
To use:
- Run `wat <filename>` to summarize a program
- Run `wat "$some-code.here();"` to explain code
- Run `wat config` to change settings
- Run `wat help` to show help
The WAT::CLI module prompts your LLM of choice:
# keeping the prompt short and sweet
my $prompt = qq:to/PROMPT/;
    You are an expert programmer.
    Produce friendly, simple, structured output.
    Summarize the following code for a new programmer.
    Limit the response to less than $response-char-limit characters.
    Highlight how to use this code:
    PROMPT
One programmer’s WAT is another’s DWIM (Do What I Mean). Under the hood, the WAT::CLI module relies on the dwim function from Brian Duggan’s LLM::DWIM module.
say dwim $prompt ~ $program-content;
The LLM::DWIM module lets you choose the LLM you prefer and is powered by Anton Antonov’s excellent LLM modules.
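To see how the pieces fit together, here is a minimal sketch of a wat-style explainer built on LLM::DWIM. It is only an illustration, not the real WAT::CLI code: the mini-wat.raku name and the shortened prompt are assumptions.

use LLM::DWIM;

# a shortened version of the prompt shown above
my $prompt = 'You are an expert programmer. '
           ~ 'Summarize the following code for a new programmer: ';

# MAIN gives us a command-line interface for free, e.g.
#   raku mini-wat.raku 'some-code.here()'
#   raku mini-wat.raku lib/WAT/CLI.rakumod
sub MAIN(Str $code-or-file) {
    # treat the argument as a file if it exists, otherwise as a code snippet
    my $content = $code-or-file.IO.f ?? $code-or-file.IO.slurp !! $code-or-file;
    say dwim $prompt ~ $content;
}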
Context Matters
Current LLMs work best with sufficient context, so they can associate problems with solutions. Without context, LLMs can stumble. Just try giving wat an empty context " " and it’s like that eccentric uncle at Christmas after one too many!
Getting Started
To have a play with wat this Christmas, install it with Raku’s package manager zef:
shell> zef install WAT
Why Not Build Your Own LLM-Powered Raku Tool?
Here are some ideas for Raku utilities to help your programming flow.
Given the Source Code of a Program as part of the prompt …
| Utility | What it does |
| --- | --- |
| Comment Injector | Inject useful comments |
| Comment Translator | Translate comments into your spoken language |
| Reviewer | Suggest potential improvements |
| Module Documentor | Document a module with RakuDoc |
| Unit Test Maker | Generate unit tests (sketched below) |
| CLI Maker | Generate a command-line interface to a program |
| API Scaffolder | Translate between an OpenAPI spec and a program |
| Code Translator | Translate code from another language to Raku (e.g., Python → Raku) |
| SEO Docs | Document the problems the program solves and generate HTML for SEO and further ingestion by LLMs (e.g., nomnom) |
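For instance, the Unit Test Maker idea could look something like this. It is a rough sketch with LLM::DWIM; the test-maker.raku name and the prompt wording are assumptions.

use LLM::DWIM;

sub MAIN(Str $filename) {
    my $prompt = q:to/PROMPT/;
        You are an expert Raku programmer.
        Write unit tests, using Raku's Test module, that cover the
        public behaviour of the following code. Output only Raku code:
        PROMPT
    # e.g. raku test-maker.raku lib/WAT/CLI.rakumod > t/cli.t
    say dwim $prompt ~ $filename.IO.slurp;
}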
Given the Output of a Shell Command as part of the prompt …
| Utility | What it does |
| --- | --- |
| Ticket Planner | Generate a set of subtasks required to close a ticket |
| Test Error Summary | Summarise the output of many tests with next steps to fix |
| Git Log Summary | Generate a stand-up report for your next Scrum meeting (sketched below) |
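And the Git Log Summary idea might be as small as this. Again, only a sketch: the standup.raku name and the prompt are assumptions.

use LLM::DWIM;

# e.g. raku standup.raku --days=3
sub MAIN(Int :$days = 1) {
    # collect recent commit messages from the current repository
    my $proc = run 'git', 'log', "--since=$days days ago", '--oneline', :out;
    my $log  = $proc.out.slurp(:close);
    say dwim "Summarise these git commits as a short stand-up report for a Scrum meeting:\n" ~ $log;
}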
Got something in mind that will help your own programming? Thanks to Raku’s rich set of LLM modules and command-line support, it’s easy to do.
Merry Christmas, and happy coding with Raku + LLM powered utilities!
🎅+ 🦋 + ✨ = 😃