jaymody / picoGPT
- Sunday, February 12, 2023, 02:00:16
An unnecessarily tiny and minimal implementation of GPT-2 in NumPy.
You've seen openai/gpt-2.
You've seen karpathy/minGPT.
You've even seen karpathy/nanoGPT!
But have you seen picoGPT??!?
picoGPT is an unnecessarily tiny and minimal implementation of GPT-2 in plain NumPy. The entire forward pass is 40 lines of code. I wrote a related blog post about picoGPT.
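To give a flavor of what "GPT-2 in plain NumPy" means, here's a minimal sketch of the kind of building blocks such a forward pass is made of (GELU, softmax, layer norm, and scaled dot-product attention); the exact code in gpt2.py may differ:

```python
import numpy as np

def gelu(x):
    # GELU activation via the common tanh approximation
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def softmax(x):
    # numerically stable softmax over the last axis
    exp_x = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return exp_x / np.sum(exp_x, axis=-1, keepdims=True)

def layer_norm(x, g, b, eps=1e-5):
    # normalize over the last axis, then scale by g and shift by b
    mean = np.mean(x, axis=-1, keepdims=True)
    var = np.var(x, axis=-1, keepdims=True)
    return g * (x - mean) / np.sqrt(var + eps) + b

def attention(q, k, v, mask):
    # scaled dot-product attention with a causal mask added to the scores
    return softmax(q @ k.T / np.sqrt(q.shape[-1]) + mask) @ v
```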
picoGPT ships in two flavors: gpt2.py and gpt2_pico.py.
A quick breakdown of each of the files:
- encoder.py contains the code for OpenAI's BPE Tokenizer, taken straight from their gpt-2 repo.
- utils.py contains the code to download and load the GPT-2 model weights, tokenizer, and hyper-parameters.
- gpt2.py contains the actual GPT model and generation code, which we can run as a Python script.
- gpt2_pico.py is the same as gpt2.py, but in even fewer lines of code. Why? Because why not.

Install the dependencies:

pip install -r requirements.txt
If you're using an M1 Macbook, you'll need to replace tensorflow with tensorflow-macos.

Tested on Python 3.9.10.
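Putting the pieces together, loading the weights and round-tripping text through the BPE tokenizer looks roughly like this. A sketch, assuming utils.py exposes a loader named load_encoder_hparams_and_params; check the source for the exact API:

```python
from utils import load_encoder_hparams_and_params  # assumed loader name

# downloads the 124M weights into ./models on first run (assumption)
encoder, hparams, params = load_encoder_hparams_and_params("124M", "models")

ids = encoder.encode("Alan Turing theorized that computers would one day become")
print(ids)                  # list of BPE token ids
print(encoder.decode(ids))  # round-trips back to the original string
```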
To generate text, run:

python gpt2.py "Alan Turing theorized that computers would one day become"

which generates:
the most powerful machines on the planet.
The computer is a machine that can perform complex calculations, and it can perform these calculations in a way that is very similar to the human brain.
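A minimal sketch of what such a generation loop can look like, assuming greedy (argmax) decoding and a forward pass that maps a list of token ids to per-position logits; the real loop lives in gpt2.py:

```python
import numpy as np

def generate(ids, logits_fn, n_tokens_to_generate):
    """Greedy decoding: repeatedly pick the most likely next token.

    logits_fn maps a list of token ids to an array of shape [n_seq, n_vocab];
    in picoGPT this role is played by the forward pass in gpt2.py.
    """
    for _ in range(n_tokens_to_generate):
        logits = logits_fn(ids)
        next_id = int(np.argmax(logits[-1]))  # most likely next token
        ids.append(next_id)
    return ids[-n_tokens_to_generate:]  # return only the newly generated ids

# toy check with a fake "model" over a 5-token vocabulary that always predicts token 3
fake_logits = lambda ids: np.tile(np.array([0.0, 0.0, 0.0, 1.0, 0.0]), (len(ids), 1))
print(generate([0, 1], fake_logits, 4))  # -> [3, 3, 3, 3]
```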
You can also control the number of tokens to generate, the model size (one of ["124M", "355M", "774M", "1558M"]), and the directory to save the models:
python gpt2.py \
"Alan Turing theorized that computers would one day become" \
--n_tokens_to_generate 40 \
--model_size "124M" \
--models_dir "models"
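Since the script's entry point is an ordinary function, you can also drive it from Python instead of the shell. A sketch, assuming gpt2.py exposes a main function whose keyword arguments mirror the CLI flags above:

```python
from gpt2 import main  # assumed entry point; signature mirrors the CLI flags

output = main(
    "Alan Turing theorized that computers would one day become",
    n_tokens_to_generate=40,
    model_size="124M",
    models_dir="models",
)
print(output)
```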