remove samples
README.md (1 addition, 10 deletions)
@@ -1,6 +1,6 @@
 # gpt-2
 
-Code and samples from the paper ["Language Models are Unsupervised Multitask Learners"](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf).
+Code from the paper ["Language Models are Unsupervised Multitask Learners"](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf).
 
 We have currently released small (117M parameter) and medium (345M parameter) versions of GPT-2. While we have not released the larger models, we have [released a dataset](https://github.com/openai/gpt-2-output-dataset) for researchers to study their behaviors.
 
@@ -30,15 +30,6 @@ See [DEVELOPERS.md](./DEVELOPERS.md)
 
 See [CONTRIBUTORS.md](./CONTRIBUTORS.md)
 
-## GPT-2 samples
-
-| WARNING: Samples are unfiltered and may contain offensive content. |
-| --- |
-
-While we have not yet released GPT-2 itself, you can see some samples from it in the `gpt-2-samples` folder.
-We show unconditional samples with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.
-We show conditional samples, with contexts drawn from `WebText`'s test set, with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.
-
 ## Citation
 
 Please use the following bibtex entry:
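The removed section refers to three sampling configurations: temperature 1 with no truncation, temperature 0.7, and top_k 40 truncation. As a rough illustration of what those settings mean, here is a minimal, self-contained sketch of temperature scaling and top-k truncation over a next-token logit vector. It is not the repository's own sampling code; the function name and defaults are placeholders for illustration only.

```python
# Illustrative sketch (not the gpt-2 repo's sampling code): temperature scaling
# followed by optional top_k truncation of a next-token logit vector.
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=0, rng=None):
    """Sample a token id from `logits` (shape [vocab_size]).

    temperature=1.0 leaves the distribution unchanged; lower values sharpen it.
    top_k=0 means no truncation; top_k=40 keeps only the 40 most likely tokens.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature

    if top_k > 0:
        # Mask out every logit below the k-th largest one.
        kth_best = np.sort(logits)[-top_k]
        logits = np.where(logits < kth_best, -np.inf, logits)

    # Softmax over the (possibly truncated) logits, then draw one token.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# The three configurations mentioned in the removed README section:
logits = np.random.default_rng(0).normal(size=50257)  # GPT-2 vocabulary size
print(sample_next_token(logits))                    # temperature 1, no truncation
print(sample_next_token(logits, temperature=0.7))   # temperature 0.7
print(sample_next_token(logits, top_k=40))          # truncation with top_k 40
```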