Copy `.env.sample` to `.env`, load the environment variables, and start the `app` service:

```sh
cp .env.sample .env
set -a; source .env; set +a
docker-compose pull app
docker-compose up app
```

To use Snorkel, spaCy and Janome:

```sh
docker-compose up snorkel
```
Service `app` includes:

- Python 3
- Google Cloud Datalab with many libraries, e.g. Jupyter, NLTK, NumPy, pandas, scikit-learn and SciPy
- MeCab
- gensim
- spaCy
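To confirm that these libraries are actually importable inside the `app` container, a small smoke-test script can help (an illustration, not part of the repository; the module names are assumptions based on the list above). Run it with something like `docker-compose exec app python3 check_libs.py`:

```python
# check_libs.py -- smoke test for the NLP stack in the `app` service.
# The module names below are assumptions based on the README's library list.
import importlib.util

LIBS = ["MeCab", "gensim", "spacy", "nltk", "numpy", "pandas", "sklearn", "scipy"]

def available(module_name):
    """Return True if the module can be located without importing it."""
    return importlib.util.find_spec(module_name) is not None

if __name__ == "__main__":
    for name in LIBS:
        print("{}: {}".format(name, "ok" if available(name) else "MISSING"))
```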
Service `snorkel` includes:

- Snorkel
- spaCy
- Janome
Copy `.env.sample` to `.env` and edit it to define environment variables for Docker:

```sh
cp .env.sample .env
set -a; source .env; set +a
docker-compose build
```
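For reference, a `.env` might look like the sketch below. The variable names (`IMAGE_NAME`, `IMAGE_TAG`, `PROJECT_ID`) come from the commands in this README; the values are placeholders to adjust for your own setup:

```sh
# .env -- example values (placeholders; adjust for your setup)
IMAGE_NAME=datalab-nlp        # hypothetical image name
IMAGE_TAG=latest
PROJECT_ID=my-gcp-project     # your GCP project ID
```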
You can set up Docker Hub and GCR to build the image directly from your GitHub repository.
```sh
docker-compose up app          # start the container
docker-compose exec app bash   # enter the container
gcloud auth login              # run inside the container
```
Push the image to Docker Hub:

```sh
docker-compose push app
```

or to GCR:

```sh
docker tag $IMAGE_NAME:$IMAGE_TAG gcr.io/$PROJECT_ID/datalab-nlp:latest
gcloud docker -- push gcr.io/$PROJECT_ID/datalab-nlp:latest
```
Pull the image from Docker Hub:

```sh
docker-compose pull app
```

or from GCR:

```sh
gcloud docker -- pull gcr.io/$PROJECT_ID/datalab-nlp
```
From Google Cloud Shell, you should be able to run:

```sh
datalab create nlp --image-name $IMAGE_NAME:$IMAGE_TAG
```

or:

```sh
datalab create nlp --image-name gcr.io/$PROJECT_ID/datalab-nlp:latest
```

Note that Datalab cannot pull a custom image from GCR.