Commit 0465394bd3 by Jeff Wu, 2019-02-28 15:51:34 -08:00 (parent 9d1e704fec)

@@ -101,8 +101,23 @@ While we have not yet released GPT-2 itself, you can see some samples from it in
We show unconditional samples with default settings (temperature 1 and no truncation), with temperature 0.7, and with top_k 40 truncation.
We show conditional samples, with contexts drawn from `WebText`'s test set, with default settings (temperature 1 and no truncation), with temperature 0.7, and with top_k 40 truncation.
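The settings above combine two knobs: a temperature that rescales the logits before the softmax, and optional top-k truncation that restricts sampling to the k most likely tokens. A minimal sketch of that sampling step, in NumPy rather than the repository's own TensorFlow code (the function name `sample_from_logits` is illustrative, not from the repo):

```python
import numpy as np

def sample_from_logits(logits, temperature=1.0, top_k=0, rng=None):
    """Sample a token id from raw logits.

    temperature=1.0 with top_k=0 matches the default settings
    (no truncation); top_k=40 keeps only the 40 most likely
    tokens before sampling. Illustrative sketch only.
    """
    rng = rng or np.random.default_rng(0)
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        # Mask out everything below the k-th largest logit.
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)
    # Softmax, shifted by the max for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

Lower temperatures (e.g. 0.7) sharpen the distribution toward high-probability tokens, while top-k truncation zeroes out the long tail entirely; the two can be combined.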
## Citation
Please use the following BibTeX entry:
```
@article{radford2019language,
  title={Language Models are Unsupervised Multitask Learners},
  author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
  year={2019}
}
```
## Future work
We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.
## License
MIT