
Updating references to the project repository to kubeflow/pipelines. #25

Merged: 3 commits, Nov 2, 2018
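A rename of this kind, touching the same slug across many files, is usually scripted rather than edited by hand. The sketch below shows one way such a bulk rewrite could be done; the function name and the temp-file rewrite strategy are illustrative assumptions, not part of this PR.

```shell
# Sketch: rewrite the old repo slug to the new one in every file under a
# directory. Assumes POSIX sh, grep, and sed; the function name is hypothetical.
rename_repo_refs() {
  old='googleprivate/ml'
  new='kubeflow/pipelines'
  # List files containing the old slug, then rewrite each one via a temp file.
  grep -rl "$old" "$1" | while IFS= read -r f; do
    sed "s#${old}#${new}#g" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
  done
}
```

Running `rename_repo_refs .` from the repo root would cover the README, Dockerfiles, and sample docs touched here; dropping the `mv` first gives a safe dry run.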
10 changes: 5 additions & 5 deletions README.md
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
[![Build Status](https://travis-ci.com/googleprivate/ml.svg?token=JjfzFsYGxZwkHvXFCpwt&branch=master)](https://travis-ci.com/googleprivate/ml)
[![Build Status](https://travis-ci.com/kubeflow/pipelines.svg?token=JjfzFsYGxZwkHvXFCpwt&branch=master)](https://travis-ci.com/kubeflow/pipelines)

# ML Pipeline Services - Overview

@@ -112,14 +112,14 @@ gcloud container clusters create $CLUSTER_NAME \
```
Here we choose the cloud-platform scope so the cluster can invoke GCP APIs. You can find all the options for creating a cluster [here](https://cloud.google.com/sdk/gcloud/reference/container/clusters/create).

Next, grant your user account permission to create new cluster roles. This step is necessary because installing ML Pipelines Services inlcudes installing a few [clusterroles](https://github.com/googleprivate/ml/search?utf8=%E2%9C%93&q=clusterrole+path%3Aml-pipeline%2Fml-pipeline&type=).
Next, grant your user account permission to create new cluster roles. This step is necessary because installing ML Pipelines Services includes installing a few [clusterroles](https://github.com/kubeflow/pipelines/search?utf8=%E2%9C%93&q=clusterrole+path%3Aml-pipeline%2Fml-pipeline&type=).
```bash
kubectl create clusterrolebinding ml-pipeline-admin-binding --clusterrole=cluster-admin --user=$(gcloud config get-value account)
```

## Deploy ML Pipeline Services and Kubeflow

Go to [release page](https://github.com/googleprivate/ml/releases) to find a version of ML Pipeline Services. Deploy the ML Pipeline Services and Kubeflow to your cluster.
Go to [release page](https://github.com/kubeflow/pipelines/releases) to find a version of ML Pipeline Services. Deploy the ML Pipeline Services and Kubeflow to your cluster.

For example:
```bash
@@ -166,7 +166,7 @@ If you are using Cloud Shell, you could view the UI by open the [web preview](ht
If you are using local console instead of Cloud Shell, you can access the ML pipeline UI at [localhost:8080/pipeline](http://localhost:8080/pipeline).

## Run your first TFJob pipeline
See the following authoring guide on how to compile your python pipeline code into workflow tar file. Then, follow [README.md](https://github.com/googleprivate/ml/blob/master/samples/kubeflow-tf/README.md) to deploy your first TFJob pipeline.
See the following authoring guide on how to compile your python pipeline code into a workflow tar file. Then, follow [README.md](https://github.com/kubeflow/pipelines/blob/master/samples/kubeflow-tf/README.md) to deploy your first TFJob pipeline.

## Uninstall
To uninstall ML pipeline, download the bootstrapper file and change the arguments to the deployment job.
@@ -189,7 +189,7 @@ then create job using the updated YAML by running ```kubectl create -f bootstrap

# ML Pipeline Services - Authoring Guideline

For more details, see [README.md](https://github.com/googleprivate/ml/blob/master/samples/README.md).
For more details, see [README.md](https://github.com/kubeflow/pipelines/blob/master/samples/README.md).

## Setup
* Create a python3 environment.
4 changes: 2 additions & 2 deletions backend/Dockerfile
@@ -7,7 +7,7 @@ RUN curl -fsSL -o /bin/dep https://github.com/golang/dep/releases/download/v0.5.
ADD https://github.com/golang/dep/releases/download/v0.5.0/dep-linux-amd64 /usr/bin/dep
RUN chmod +x /usr/bin/dep

WORKDIR /go/src/github.com/googleprivate/ml
WORKDIR /go/src/github.com/kubeflow/pipelines
COPY . .

# Needed for github.com/mattn/go-sqlite3
@@ -22,7 +22,7 @@ ENV COMMIT_SHA=${COMMIT_SHA}
WORKDIR /bin

COPY --from=builder /bin/apiserver /bin/apiserver
COPY --from=builder /go/src/github.com/googleprivate/ml/third_party/license.txt /bin/license.txt
COPY --from=builder /go/src/github.com/kubeflow/pipelines/third_party/license.txt /bin/license.txt
COPY backend/src/apiserver/config/ /config
COPY backend/src/apiserver/samples/ /samples

4 changes: 2 additions & 2 deletions backend/Dockerfile.persistenceagent
@@ -8,7 +8,7 @@ RUN curl -fsSL -o /bin/dep https://github.com/golang/dep/releases/download/v0.5.
ADD https://github.com/golang/dep/releases/download/v0.5.0/dep-linux-amd64 /usr/bin/dep
RUN chmod +x /usr/bin/dep

WORKDIR /go/src/github.com/googleprivate/ml
WORKDIR /go/src/github.com/kubeflow/pipelines
COPY . .

# Needed for github.com/mattn/go-sqlite3
@@ -19,7 +19,7 @@ FROM alpine
WORKDIR /bin

COPY --from=builder /bin/persistence_agent /bin/persistence_agent
COPY --from=builder /go/src/github.com/googleprivate/ml/third_party/license.txt /bin/license.txt
COPY --from=builder /go/src/github.com/kubeflow/pipelines/third_party/license.txt /bin/license.txt
RUN chmod +x /bin/persistence_agent

CMD persistence_agent \
4 changes: 2 additions & 2 deletions backend/Dockerfile.scheduledworkflow
@@ -7,7 +7,7 @@ RUN curl -fsSL -o /bin/dep https://github.com/golang/dep/releases/download/v0.5.
ADD https://github.com/golang/dep/releases/download/v0.5.0/dep-linux-amd64 /usr/bin/dep
RUN chmod +x /usr/bin/dep

WORKDIR /go/src/github.com/googleprivate/ml
WORKDIR /go/src/github.com/kubeflow/pipelines
COPY . .

# Needed for github.com/mattn/go-sqlite3
@@ -18,7 +18,7 @@ FROM alpine
WORKDIR /bin

COPY --from=builder /bin/controller /bin/controller
COPY --from=builder /go/src/github.com/googleprivate/ml/third_party/license.txt /bin/license.txt
COPY --from=builder /go/src/github.com/kubeflow/pipelines/third_party/license.txt /bin/license.txt
RUN chmod +x /bin/controller

CMD /bin/controller -alsologtostderr=true
2 changes: 1 addition & 1 deletion backend/src/crd/hack/update-codegen.sh
@@ -27,6 +27,6 @@ CODEGEN_PKG=${SCRIPT_ROOT}/../../../../../../k8s.io/code-generator
echo "CODEGEN_PKG is $CODEGEN_PKG"

${CODEGEN_PKG}/generate-groups.sh "deepcopy,client,informer,lister" \
github.com/googleprivate/ml/backend/src/crd/pkg/client github.com/googleprivate/ml/backend/src/crd/pkg/apis \
github.com/kubeflow/pipelines/backend/src/crd/pkg/client github.com/kubeflow/pipelines/backend/src/crd/pkg/apis \
scheduledworkflow:v1alpha1 \
--go-header-file ${SCRIPT_ROOT}/hack/custom-boilerplate.go.txt
8 changes: 4 additions & 4 deletions developer_guide.md
@@ -1,6 +1,6 @@
# ML Pipeline Development Guideline

This document describes the development guideline to contribute to ML pipeline project. Please check the [main page](https://github.com/googleprivate/ml/blob/master/README.md) for instruction on how to deploy a ML pipeline system.
This document describes the development guideline for contributing to the ML pipeline project. Please check the [main page](https://github.com/kubeflow/pipelines/blob/master/README.md) for instructions on how to deploy an ML pipeline system.

## ML pipeline deployment

@@ -17,9 +17,9 @@ The docker container accepts various parameters to customize your deployment.
- **--report_usage** whether to report usage for the deployment
- **--uninstall** to uninstall everything.

See [bootstrapper.yaml](https://github.com/googleprivate/ml/blob/master/bootstrapper.yaml) for examples on how to pass in parameter.
See [bootstrapper.yaml](https://github.com/kubeflow/pipelines/blob/master/bootstrapper.yaml) for examples on how to pass in parameters.

Alternatively, you can use [deploy.sh](https://github.com/googleprivate/ml/blob/master/ml-pipeline/deploy.sh) if you want to interact with Ksonnet directly.
Alternatively, you can use [deploy.sh](https://github.com/kubeflow/pipelines/blob/master/ml-pipeline/deploy.sh) if you want to interact with Ksonnet directly.
To deploy, run the script locally.
```bash
$ ml-pipeline/deploy.sh
@@ -101,7 +101,7 @@ pip install ./dsl-compiler/ --upgrade && python ./dsl-compiler/tests/main.py
## Integration test

### API server
Check [this](https://github.com/googleprivate/ml/blob/master/test/apiserver/README.md) page for more details.
Check [this](https://github.com/kubeflow/pipelines/blob/master/test/apiserver/README.md) page for more details.

## E2E test
TODO: Add instruction
6 changes: 3 additions & 3 deletions frontend/server/package.json
@@ -22,12 +22,12 @@
},
"repository": {
"type": "git",
"url": "git+https://github.com/googleprivate/ml.git"
"url": "git+https://github.com/kubeflow/pipelines.git"
},
"author": "",
"license": "",
"bugs": {
"url": "https://github.com/googleprivate/ml/issues"
"url": "https://github.com/kubeflow/pipelines/issues"
},
"homepage": "https://github.com/googleprivate/ml#readme"
"homepage": "https://github.com/kubeflow/pipelines#readme"
}
2 changes: 1 addition & 1 deletion ml-pipeline/deploy.sh
@@ -120,7 +120,7 @@ echo "Initialized ksonnet APP completed successfully"
# a known issue: https://github.com/ksonnet/ksonnet/issues/232, we are working around by creating
# a symbolic links in ./vendor and manually modifying app.yaml
# when the repo is public we can do following:
# ks registry add ml-pipeline github.com/googleprivate/ml/tree/master/ml-pipeline
# ks registry add ml-pipeline github.com/kubeflow/pipelines/tree/master/ml-pipeline
# ks pkg install ml-pipeline/ml-pipeline
BASEDIR=$(cd $(dirname "$0") && pwd)
ln -s ${BASEDIR} ${APP_DIR}/vendor/ml-pipeline
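The workaround in the deploy.sh hunk above, symlinking the local package into the ksonnet app's vendor directory instead of using `ks pkg install`, can be sketched as a small helper. The function name and argument order below are assumptions for illustration, not part of the actual script.

```shell
# Sketch: link a local ksonnet package into an app's vendor directory.
# $1 = package source dir, $2 = ksonnet app dir. Names are hypothetical.
link_vendor_pkg() {
  mkdir -p "$2/vendor"
  # -f replaces a stale link; -n avoids descending into an existing link target.
  ln -sfn "$1" "$2/vendor/ml-pipeline"
}
```

Because the link points at the working tree, edits to the package are picked up by the ksonnet app without reinstalling, which is the point of the workaround.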
4 changes: 2 additions & 2 deletions ml-pipeline/ml-pipeline/parts.yaml
@@ -26,10 +26,10 @@
],
"repository": {
"type": "git",
"url": "https://github.com/googleprivate/ml"
"url": "https://github.com/kubeflow/pipelines"
},
"bugs": {
"url": "https://github.com/googleprivate/ml/issues"
"url": "https://github.com/kubeflow/pipelines/issues"
},
"keywords": [
"ml-pipeline"
4 changes: 2 additions & 2 deletions samples/README.md
@@ -71,7 +71,7 @@ Install [docker](https://www.docker.com/get-docker).

### Step One: Create A Container For Each Component
In most cases, you need to create your own container image that includes your program. You can find container
building examples from [here](https://github.com/googleprivate/ml/blob/master/components)(in the directory, go to any subdirectory and then go to “containers” directory).
building examples from [here](https://github.com/kubeflow/pipelines/blob/master/components) (in the directory, go to any subdirectory and then go to the “containers” directory).

If your component creates some outputs to be fed as inputs to the downstream components, each output has
to be a string and needs to be written to a separate local text file by the container image.
@@ -155,4 +155,4 @@ args go first and keyword args go next.
should all be of that type. The default values will show up in the Pipeline UI but can be overwritten.


See an example [here](https://github.com/googleprivate/ml/blob/master/samples/xgboost-spark/xgboost-training-cm.py).
See an example [here](https://github.com/kubeflow/pipelines/blob/master/samples/xgboost-spark/xgboost-training-cm.py).
2 changes: 1 addition & 1 deletion samples/basic/README.md
@@ -1,5 +1,5 @@
## Compile
Follow [README.md](https://github.com/googleprivate/ml/blob/master/samples/README.md) to install the compiler and
Follow [README.md](https://github.com/kubeflow/pipelines/blob/master/samples/README.md) to install the compiler and
compile the sample python into workflow yaml.

"sequential.yaml" is pre-generated for reference purposes.
18 changes: 9 additions & 9 deletions samples/kubeflow-tf/README.md
@@ -12,7 +12,7 @@ Note: The trainer depends on KubeFlow API Version v1alpha2.

## Compiling the pipeline template

Follow [README.md](https://github.com/googleprivate/ml/blob/master/samples/README.md) to install the compiler and then run the following command to compile the pipeline:
Follow [README.md](https://github.com/kubeflow/pipelines/blob/master/samples/README.md) to install the compiler and then run the following command to compile the pipeline:

```bash
dsl-compile --py kubeflow-training-classification.py --output kubeflow-training-classification.tar.gz
@@ -29,17 +29,17 @@ The pipeline will require one argument:
## Components Source

Preprocessing:
[source code](https://github.com/googleprivate/ml/tree/master/components/dataflow/tft),
[container](https://github.com/googleprivate/ml/tree/master/components/dataflow/containers/tft)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/tft),
[container](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/containers/tft)

Training:
[source code](https://github.com/googleprivate/ml/tree/master/components/kubeflow/launcher),
[container](https://github.com/googleprivate/ml/tree/master/components/kubeflow/container/launcher)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/kubeflow/launcher),
[container](https://github.com/kubeflow/pipelines/tree/master/components/kubeflow/container/launcher)

Prediction:
[source code](https://github.com/googleprivate/ml/tree/master/components/dataflow/predict),
[container](https://github.com/googleprivate/ml/tree/master/components/dataflow/containers/predict)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/predict),
[container](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/containers/predict)

Confusion Matrix:
[source code](https://github.com/googleprivate/ml/tree/master/components/local/evaluation),
[container](https://github.com/googleprivate/ml/tree/master/components/local/containers/confusion_matrix)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/local/evaluation),
[container](https://github.com/kubeflow/pipelines/tree/master/components/local/containers/confusion_matrix)
14 changes: 7 additions & 7 deletions samples/resnet-cmle/README.md
@@ -8,7 +8,7 @@ Training and serving uses Google Cloud Machine Learning Engine. So [Cloud Machin
given project.

## Compile
Follow [README.md](https://github.com/googleprivate/ml/blob/master/samples/README.md) to install the compiler and
Follow [README.md](https://github.com/kubeflow/pipelines/blob/master/samples/README.md) to install the compiler and
compile your python sample into workflow yaml.

## Deploy
@@ -25,13 +25,13 @@ bucket: A Google storage bucket to store results.
## Components Source

Preprocessing:
[source code](https://github.com/googleprivate/ml/tree/master/components/resnet-cmle/resnet)
[container](https://github.com/googleprivate/ml/tree/master/components/resnet-cmle/containers/preprocess)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/resnet-cmle/resnet)
[container](https://github.com/kubeflow/pipelines/tree/master/components/resnet-cmle/containers/preprocess)

Training:
[source code](https://github.com/googleprivate/ml/tree/master/components/resnet-cmle/resnet)
[container](https://github.com/googleprivate/ml/tree/master/components/resnet-cmle/containers/train)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/resnet-cmle/resnet)
[container](https://github.com/kubeflow/pipelines/tree/master/components/resnet-cmle/containers/train)

Deployment:
[source code](https://github.com/googleprivate/ml/tree/master/components/resnet-cmle/resnet)
[container](https://github.com/googleprivate/ml/tree/master/components/resnet-cmle/containers/deploy)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/resnet-cmle/resnet)
[container](https://github.com/kubeflow/pipelines/tree/master/components/resnet-cmle/containers/deploy)
18 changes: 9 additions & 9 deletions samples/tfma/README.md
@@ -31,7 +31,7 @@ Instructions for enabling that can be found [here](https://cloud.google.com/endp

## Compiling the pipeline template

Follow [README.md](https://github.com/googleprivate/ml/blob/master/samples/README.md) to install the compiler and then run the following to compile the pipeline:
Follow [README.md](https://github.com/kubeflow/pipelines/blob/master/samples/README.md) to install the compiler and then run the following to compile the pipeline:

```bash
dsl-compile --py taxi-cab-classification-pipeline.py --output taxi-cab-classification-pipeline.tar.gz
@@ -49,17 +49,17 @@ The pipeline will require two arguments:
## Components Source

Preprocessing:
[source code](https://github.com/googleprivate/ml/tree/master/components/dataflow/tft)
[container](https://github.com/googleprivate/ml/tree/master/components/dataflow/containers/tft)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/tft)
[container](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/containers/tft)

Training:
[source code](https://github.com/googleprivate/ml/tree/master/components/kubeflow/launcher)
[container](https://github.com/googleprivate/ml/tree/master/components/kubeflow/container/launcher)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/kubeflow/launcher)
[container](https://github.com/kubeflow/pipelines/tree/master/components/kubeflow/container/launcher)

Analysis:
[source code](https://github.com/googleprivate/ml/tree/master/components/dataflow/tfma)
[container](https://github.com/googleprivate/ml/tree/master/components/dataflow/containers/tfma)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/tfma)
[container](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/containers/tfma)

Prediction:
[source code](https://github.com/googleprivate/ml/tree/master/components/dataflow/predict)
[container](https://github.com/googleprivate/ml/tree/master/components/dataflow/containers/predict)
[source code](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/predict)
[container](https://github.com/kubeflow/pipelines/tree/master/components/dataflow/containers/predict)