Project: gpt-2-server (Paul Geisler)

Commit 79a246a5, authored Mar 6, 2019 by Jeff Wu
Parent: 953530fc

> add contributors md and move dev docs out
Showing 3 changed files, with 106 additions and 82 deletions:

- CONTRIBUTORS.md: +17 −0
- DEVELOPERS.md: +85 −0
- README.md: +4 −82
**CONTRIBUTORS.md** · new file (0 → 100644) · +17 −0
# Contributors (alphabetically)

* **[madisonmay](https://github.com/madisonmay)**
  Added Dockerfiles
* **[Margaret Mitchell et al](https://arxiv.org/abs/1810.03993)**
  Our [usage](./readme#usage) writeup was loosely inspired by the paper [Model Cards for Model Reporting](https://arxiv.org/abs/1810.03993) and related conversations with some of the authors.
* **[webproduktion01](https://github.com/webproduktion01)**
  Ported download script to python.

**[Full code contributors list](https://github.com/openai/gpt-2/contributors).**
**DEVELOPERS.md** · new file (0 → 100644) · +85 −0
# Installation

Git clone this repository, and `cd` into its directory for the remaining commands:

```
git clone https://github.com/openai/gpt-2.git && cd gpt-2
```

Then, follow the instructions for either native or Docker installation.

## Native Installation

All steps can optionally be done in a virtual environment using tools such as `virtualenv` or `conda`.
Install tensorflow 1.12 (with GPU support, if you have a GPU and want everything to run faster):
```
pip3 install tensorflow==1.12.0
```
or
```
pip3 install tensorflow-gpu==1.12.0
```
Install other python packages:
```
pip3 install -r requirements.txt
```
Download the model data:
```
python3 download_model.py 117M
```
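After the download finishes, it can be useful to confirm that the model files actually landed on disk. A minimal sketch (not part of the repo; the filenames below are an assumption — check `download_model.py` for the authoritative list):

```python
from pathlib import Path

# Files the download script is expected to place under models/117M.
# This list is an assumption for illustration; download_model.py is the
# source of truth for what actually gets fetched.
EXPECTED = [
    "checkpoint", "encoder.json", "hparams.json",
    "model.ckpt.data-00000-of-00001", "model.ckpt.index",
    "model.ckpt.meta", "vocab.bpe",
]

def missing_files(model_dir="models/117M"):
    """Return the expected files that are not present in model_dir."""
    root = Path(model_dir)
    return [name for name in EXPECTED if not (root / name).exists()]

# An empty list means everything is in place.
print(missing_files())
```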
## Docker Installation

Build the Dockerfile and tag the created image as `gpt-2`:

```
docker build --tag gpt-2 -f Dockerfile.gpu . # or Dockerfile.cpu
```

Start an interactive bash session from the `gpt-2` docker image.

You can opt to use the `--runtime=nvidia` flag if you have access to an NVIDIA GPU and a valid install of [nvidia-docker 2.0](https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0)).
```
docker run --runtime=nvidia -it gpt-2 bash
```
# Running
| WARNING: Samples are unfiltered and may contain offensive content. |
| --- |
Some of the examples below may include Unicode text characters. Set the environment variable:
```
export PYTHONIOENCODING=UTF-8
```
to override the standard stream settings in UTF-8 mode.
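As a quick sanity check (a minimal sketch, not part of the repo), you can inspect the stream encoding from Python; with `PYTHONIOENCODING=UTF-8` set, the interpreter forces UTF-8 on stdin/stdout/stderr regardless of the terminal's locale:

```python
import sys

# With PYTHONIOENCODING=UTF-8 exported, this reports "UTF-8" (or "utf-8")
# and non-ASCII text prints without a UnicodeEncodeError.
print(sys.stdout.encoding)
print("caf\u00e9")
```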
## Unconditional sample generation
To generate unconditional samples from the small model:
```
python3 src/generate_unconditional_samples.py | tee /tmp/samples
```
There are various flags for controlling the samples:
```
python3 src/generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee /tmp/samples
```
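`--top_k` and `--temperature` shape how each next token is drawn from the model's output distribution: temperature sharpens or flattens the distribution, and top-k restricts sampling to the k highest-scoring candidates. A stdlib-only sketch of that sampling step (illustrative only, not the repo's TensorFlow implementation):

```python
import math
import random

def sample_next(logits, top_k=40, temperature=0.7, rng=random):
    """Sample an index from logits using top-k filtering and temperature."""
    # Lower temperature -> sharper distribution; higher -> closer to uniform.
    scaled = [l / temperature for l in logits]
    # Keep only the top_k highest-scoring candidates; the rest get zero mass.
    keep = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:top_k]
    # Softmax over the kept logits (subtract the max for numerical stability).
    m = max(scaled[i] for i in keep)
    weights = [math.exp(scaled[i] - m) for i in keep]
    return rng.choices(keep, weights=weights, k=1)[0]

# With top_k=2, only the two strongest candidates (indices 0 and 1) can be drawn.
print(sample_next([2.0, 1.0, 0.1, -3.0], top_k=2))
```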
To check flag descriptions, use:
```
python3 src/generate_unconditional_samples.py -- --help
```
## Conditional sample generation
To give the model custom prompts, you can use:
```
python3 src/interactive_conditional_samples.py --top_k 40
```
To check flag descriptions, use:
```
python3 src/interactive_conditional_samples.py -- --help
```
**README.md** · +4 −82
```diff
@@ -22,91 +22,13 @@ Please [let us know](mailto:languagequestions@openai.com) if you’re doing inte
 - Potential malicious use cases and defenses against them (e.g. the detectability of synthetic text)
 - The extent of problematic content (e.g. bias) being baked into the models and effective mitigations
 
-## Installation
+## Development
 
-Git clone this repository, and `cd` into directory for remaining commands
+See [DEVELOPERS.md](./DEVELOPERS.md)
 
-```
-git clone https://github.com/openai/gpt-2.git && cd gpt-2
-```
-
-Then, follow instructions for either native or Docker installation.
-
-### Native Installation
-
-All steps can optionally be done in a virtual environment using tools such as `virtualenv` or `conda`.
-
-Install tensorflow 1.12 (with GPU support, if you have a GPU and want everything to run faster)
-```
-pip3 install tensorflow==1.12.0
-```
-or
-```
-pip3 install tensorflow-gpu==1.12.0
-```
-
-Install other python packages:
-```
-pip3 install -r requirements.txt
-```
-
-Download the model data
-```
-python3 download_model.py 117M
-```
-
-### Docker Installation
-
-Build the Dockerfile and tag the created image as `gpt-2`:
-```
-docker build --tag gpt-2 -f Dockerfile.gpu . # or Dockerfile.cpu
-```
-
-Start an interactive bash session from the `gpt-2` docker image.
-
-You can opt to use the `--runtime=nvidia` flag if you have access to a NVIDIA GPU
-and a valid install of [nvidia-docker 2.0](https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0)).
-```
-docker run --runtime=nvidia -it gpt-2 bash
-```
+## Contributors
 
-## Sampling scripts
+See [CONTRIBUTORS.md](./CONTRIBUTORS.md)
 
-| WARNING: Samples are unfiltered and may contain offensive content. |
-| --- |
-
-Some of the examples below may include Unicode text characters. Set the environment variable:
-```
-export PYTHONIOENCODING=UTF-8
-```
-to override the standard stream settings in UTF-8 mode.
-
-### Unconditional sample generation
-
-To generate unconditional samples from the small model:
-```
-python3 src/generate_unconditional_samples.py | tee /tmp/samples
-```
-
-There are various flags for controlling the samples:
-```
-python3 src/generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee /tmp/samples
-```
-
-To check flag descriptions, use:
-```
-python3 src/generate_unconditional_samples.py -- --help
-```
-
-### Conditional sample generation
-
-To give the model custom prompts, you can use:
-```
-python3 src/interactive_conditional_samples.py --top_k 40
-```
-
-To check flag descriptions, use:
-```
-python3 src/interactive_conditional_samples.py -- --help
-```
-
 ## GPT-2 samples
```