# gpt2

bash+awk implementation of the GPT-2 124M model

## how to run
0. create the conda environment from `conda-env.yml`
1. download and export the weights: `python utils/export_weights.py`
2. make sure gawk is installed as the default awk (check with `awk -V`); mawk has issues with unicode/utf-8
3. run `source src/main -i "your_text" -n <number_of_tokens_to_generate> -w <path_to_dir_where_weights_were_exported_to> -t <path_to_tmp_dir>` (a full command sketch follows this list)
4. wait about 1-2 hours to see the result. if that seems like a lot, consider that at first it took more than 36 hours to generate 1 (one) token ¯\_(ツ)_/¯, so this is a great improvement
5. enjoy (???)
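
Put together, steps 0-4 amount to a short shell session. The sketch below is one possible walkthrough, not part of the repo: the conda environment name, the `weights/` export directory, and the `tmp/` scratch directory are assumptions (check `conda-env.yml` and the exporter's output for the real ones).

```bash
# create and activate the conda environment described in conda-env.yml
conda env create -f conda-env.yml
conda activate gpt2-awk          # env name is an assumption; see conda-env.yml

# confirm that gawk answers as the default awk (mawk has unicode/utf-8 issues)
awk -V | head -n 1

# download GPT-2 124M and export its weights
python utils/export_weights.py

# generate 20 tokens from a prompt; expect roughly 1-2 hours of runtime.
# 'weights' and 'tmp' are placeholders for the export dir and a scratch dir.
mkdir -p tmp
source src/main -i "The meaning of life is" -n 20 -w weights -t tmp
```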

to run the tests, activate the conda environment, export the weights to the `tests/assets` directory, and then run `source tests/run_all_tests.sh`
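
A minimal sketch of that test workflow follows. The README does not say how the weights should end up under `tests/assets`, so the copy step below (and the `weights/` source directory) is a placeholder, not the project's prescribed method.

```bash
# activate the environment first (env name is an assumption; see conda-env.yml)
conda activate gpt2-awk

# export the weights, then place them under tests/assets.
# the copy from 'weights/' is a placeholder approach, not documented behavior.
python utils/export_weights.py
mkdir -p tests/assets
cp -r weights/. tests/assets/

# run the whole test suite
source tests/run_all_tests.sh
```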