Show HN: Flex – transpile natural language to a programming language
Flex [1] is a transpiler that converts a series of natural-language statements into Python, Java or C++ code. It is trained using Rasa NLU [2], an open-source framework usually used for building chatbots and voice assistants.
Flex is a project I made in college in 2018 along with two of my classmates, Gourav and Sanjay. The initial idea was to create a voice assistant you could speak to that wrote code for you, similar to GitHub's upcoming (and a lot more advanced) project "Hey, GitHub" [3].
We then had to reduce the project's scope to complete it in a single semester while writing exams, preparing for job interviews and submitting reports and assignments. So we settled on creating a Python program that takes a file containing statements written in natural language as input and transpiles them to a valid Python/Java/C++ program. The project design is modular, so adding more languages and statement types is easy.
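For readers curious how a statement-classification-plus-template pipeline like this can be put together, here is a minimal, hypothetical sketch in Python. The intent names, regex patterns and code templates below are made up for illustration and are not taken from Flex's actual training data or mappings; Flex itself uses Rasa NLU for intent recognition rather than regexes.

    import re

    # Hypothetical intent patterns; purely illustrative, Flex learns
    # intents with Rasa NLU instead of matching regexes.
    PATTERNS = {
        "declare_list": re.compile(r"(?P<name>\w+) is a list of numbers", re.I),
        "print_var": re.compile(r"print (?P<name>\w+)", re.I),
    }

    # Hypothetical per-language code templates, keyed by intent.
    TEMPLATES = {
        "python": {
            "declare_list": "{name} = []",
            "print_var": "print({name})",
        },
        "java": {
            "declare_list": "List<Double> {name} = new ArrayList<>();",
            "print_var": "System.out.println({name});",
        },
    }

    def transpile(statements, lang="python"):
        """Map each natural-language statement to a line of code in `lang`."""
        out = []
        for stmt in statements:
            for intent, pattern in PATTERNS.items():
                match = pattern.search(stmt)
                if match:
                    out.append(TEMPLATES[lang][intent].format(**match.groupdict()))
                    break
            else:
                out.append("# could not transpile: " + stmt)
        return "\n".join(out)

    print(transpile(["X is a list of numbers", "print X"], lang="python"))
    # X = []
    # print(X)

Adding a new target language in this toy version just means adding another entry to the template table, which is the same kind of modularity described above.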
The next step we had in mind was to hook it up to a UI with voice input, turning it into a kind of voice-driven IDE, but once college was over and we got jobs, the project was abandoned as we moved on to other things.
Seeing the "Hey, GitHub" project on HackerNews today [4] reminded me of this project, so I just wanted to share it to inspire others to fork it and make something cool. Also looking forward to some feedback on how the project could be improved to make it more useful.
[1]: https://github.com/Flex-lang/transpiler. Not to be confused with FLEX, a strictly-typed programming language with the same name (which was released first, so we should have used a different name!)
[2]: https://github.com/rasahq/rasa
[3]: https://githubnext.com/projects/hey-github/
[4]: https://news.ycombinator.com/item?id=33543946

Can anyone explain to me what this does? I would have expected something like GPT-3 Codex or GitHub Copilot, but the example on the page looks a lot different. Specifically, I don't see any natural-language input.

At the moment it can recognise the type of statements in the training data set [1] and transpile them to Python, Java or C++ using the mappings defined here [2]. This is very different from how Codex/Copilot work, as it is trained using an NLU framework [3] which is usually used for training chatbots.
[1]: https://github.com/Flex-lang/transpiler/tree/master/transpil...
[2]: https://github.com/Flex-lang/transpiler/tree/master/transpil...

So I can input pseudocode and it will turn that into real code? Is that a proper description? Can you give me a use case? Thank you.

Yes, that's one way to describe it. One use case could be that a complete beginner to programming can enter natural commands (e.g. "X is a list of numbers", "read 10 values in X") without worrying about syntax and convert them into a working program in one of the target languages (a rough sketch of what that might look like appears at the end of this thread). Also, the input doesn't have to be in English, as the model can easily be trained with samples from other languages, so people can write/speak in their native language and get it translated to working code.

Also flex: fast lexical analyzer generator: https://en.wikipedia.org/wiki/Flex_(lexical_analyser_generat... Edit: in case the parentheses don't work (percent-encoding didn't), try: https://github.com/westes/flex

I've been experimenting with chatbot frameworks lately and was getting a little bored of building toy ones, so this might be an interesting idea for me to explore! The dependencies seem a little old, though; you should think about keeping them updated.

This is so awesome!
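As referenced in the use-case answer above, here is a hypothetical example of what the two quoted commands might transpile to in Python. The generated code is illustrative only; the actual output depends entirely on the mappings Flex defines.

    # Input statements (natural language):
    #   "X is a list of numbers"
    #   "read 10 values in X"
    # One plausible Python output:
    X = []
    for _ in range(10):
        X.append(float(input()))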