updates
README.md: 15 additions
@@ -101,8 +101,23 @@ While we have not yet released GPT-2 itself, you can see some samples from it in
We show unconditional samples with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.
We show conditional samples, with contexts drawn from `WebText`'s test set, with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.
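The settings named above (temperature scaling, and truncation to the top_k most likely tokens) can be illustrated with a minimal sampling sketch. This is not the repository's sampling code; the `logits` values and the `sample` helper below are illustrative assumptions.

```python
import numpy as np

def sample(logits, temperature=1.0, top_k=0, rng=None):
    """Sample a token id from raw logits, with temperature and optional top-k truncation."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        # Mask everything outside the k highest logits before normalizing.
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits < cutoff, -np.inf, logits)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)
```

With `temperature=1.0` and `top_k=0` this reduces to sampling from the model's raw distribution (the "default settings" above); lowering the temperature sharpens the distribution, and `top_k=40` restricts sampling to the 40 most likely tokens.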
## Citation
Please use the following bibtex entry:
```
@article{radford2019language,
title={Language Models are Unsupervised Multitask Learners},
author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
year={2019}
}
```
## Future work
We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.
## License
MIT