Vibing on the fly by having an LLM write functions during runtime
I recently saw a [meme](https://www.reddit.com/r/AICompanions/comments/1ph8w96/devel...) about modern programmers implementing code in the spirit of:
```python
def is_odd(num):
    return OpenAI.chat(f"Is {num} odd?")
```
To me this felt like it wouldn't scale very well, since every function would need its own custom prompt. So I implemented a decorator that asks `gpt-5-mini` to implement your functions at runtime, based on the function name, docstring (optional), and type annotations (optional). In practice it looks like this:

```python
from cursed_vibing_on_the_fly import ai_implement

@ai_implement
def check_if_number_is_prime(n):
    pass
```
To me this makes the code feel so much more dynamic and, dare I say, vibrant!/s