When most people think of a bird, they think of a perching bird: a wren, robin, or chickadee.
They do not think of a duck. Unless there happens to be a duck present.
And any (sane) adult would respond appropriately when asked the question "if a bird sings, what does a duck do?"
They’d quickly answer "it quacks."
The expectation is implicit. They would not and do not expect the duck to sing the way a bird would. It would go against everything they know about ducks.
Traditional, deterministic programming is one style of programming; non-deterministic, agentic programming is another. In principle, they’re both "a type of bird."
Deterministic programs sing. Given a set of inputs, built and tested properly, they will always give the same predictable output.
Non-deterministic programs quack. Given a set of inputs, prompted properly, they will not always give the same predictable output.
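The contrast can be sketched in a few lines. This is a minimal illustration, not real LLM code: `nondeterministic_reply` is a hypothetical stand-in that just samples from a fixed list, the way an LLM samples from a probability distribution over possible outputs.

```python
import random

def deterministic_add(a, b):
    # Sings: same inputs, same output, every single time.
    return a + b

def nondeterministic_reply(prompt):
    # Quacks: a toy stand-in for an LLM. It samples from a
    # distribution over possible replies, so the same prompt
    # can produce different outputs on different runs.
    replies = ["quack", "quack!", "QUACK"]
    return random.choice(replies)

assert deterministic_add(2, 3) == 5  # always
# nondeterministic_reply("hello") may differ from run to run
```

The point is not that one function is buggy and the other is not; both behave exactly as designed. The randomness is the design.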
The second system—our techno-duck—isn’t a perching bird that’s having an identity crisis. It’s a duck, doing what ducks do: quacking.
Right now, there’s a big push to integrate large language models (LLMs, what "AI" means in most conversations) into damn near everything. Results vary, as one would expect, because LLMs rely on probability to determine each response.
This is why two different people can present an identical prompt to the exact same LLM and get two completely different results. Even with identical context, consistent responses are the exception, not the rule.
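Here is roughly where that probability enters, in a deliberately simplified sketch. An LLM scores every candidate next token, converts those scores into probabilities, and then samples. The function below is a hypothetical miniature of that last step, assuming the common temperature-scaled softmax approach; it is not any particular model's implementation.

```python
import math
import random

def sample_next_token(logits, temperature=0.8):
    # Scale raw scores by temperature: lower temperature
    # concentrates probability on the top-scoring token.
    scaled = [score / temperature for score in logits]
    # Softmax (with max-subtraction for numerical stability)
    # turns the scaled scores into a probability distribution.
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sampling from that distribution is where the
    # non-determinism comes from: same logits, possibly
    # different choice each call.
    return random.choices(range(len(logits)), weights=probs)[0]
```

Run it twice on the same scores and you may get two different tokens; chain thousands of such choices together and identical prompts can wander to very different answers.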
To an individual pecking away on a smartphone, LLMs are downright magical.
To a business reliant on the outputs of an LLM? Misapplied, it’s a disaster waiting to happen.
Maybe not today. Maybe not tomorrow. You may even get away with things being peachy for years.
But one day, that duck is going to stop singing and discover its true self. It will quack. When it does, you don’t want to be the one who swore on your own life that the duck would, in fact, never quack.