# gpt-2
Code and samples from the paper "Language Models are Unsupervised Multitask Learners"
## Installation
Download the (small, 117M parameter) model data:
```
gsutil rsync -r gs://gpt-2/models/ models/
```
Install Python packages:
```
pip install -r requirements.txt
```
## Sample generation
| WARNING: Samples are unfiltered and may contain offensive content. |
| --- |
To generate unconditional samples from the small model:
```
python3 src/main.py | tee samples
```
There are various flags for controlling the samples:
```
python3 src/main.py --top_k 40 --temperature 0.7 | tee samples
```
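Both flags control standard sampling behavior: `temperature` rescales the logits before the softmax (values below 1 sharpen the distribution toward likely tokens), and `top_k` restricts sampling to the k highest-scoring tokens. A minimal NumPy sketch of the idea (illustrative only, not the repository's actual sampling code; the function and variable names here are hypothetical):
```
import numpy as np

def sample_token(logits, temperature=0.7, top_k=40):
    """Sample one token id from raw logits. Illustrative sketch only."""
    # Temperature < 1 sharpens the distribution; > 1 flattens it.
    logits = logits / temperature
    if top_k > 0:
        # Mask out everything below the k-th largest logit.
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits < cutoff, -1e10, logits)
    # Softmax over the (possibly truncated) logits.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(np.random.choice(len(logits), p=probs))
```
With settings like `--top_k 40 --temperature 0.7`, samples tend to stay more coherent at the cost of some diversity.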
While we have not yet released GPT-2 itself, you can see some unconditional samples (with default settings of temperature 1 and no truncation) in `gpt2-samples.txt`.
## Future work
We may release code for evaluating the models on various benchmarks.
We are still considering whether to release the larger models.