Release component image version a277f87ea1d4707bf860d080d06639b7caf9a1cf #1082

Merged 2 commits on Apr 4, 2019
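This PR bumps every released component image from tag 2c2445df83fa879387a200747cc20f72a7ee9727 to a277f87ea1d4707bf860d080d06639b7caf9a1cf. As context (not part of the diff), here is a minimal sketch of how a pipeline author picks up the new tag by loading one of the updated definitions with the kfp SDK; the local file path and the component_spec inspection are illustrative assumptions:

```python
from kfp import components

# Assumes a local checkout of kubeflow/pipelines at this commit.
predict_op = components.load_component_from_file(
    'components/dataflow/predict/component.yaml')

# The container image declared in the YAML (now the a277f87e... tag) is baked
# into every task created from this op when a pipeline is compiled.
print(predict_op.component_spec.implementation.container.image)
```

Any pipeline recompiled against these component.yaml files therefore runs the new images.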
2 changes: 1 addition & 1 deletion components/dataflow/predict/component.yaml
@@ -15,7 +15,7 @@ outputs:
- {name: Predictions dir, type: GCPPath, description: 'GCS or local directory.'} #Will contain prediction_results-* and schema.json files; TODO: Split outputs and replace dir with single file # type: {GCSPath: {path_type: Directory}}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, /ml/predict.py]
args: [
--data, {inputValue: Data file pattern},
2 changes: 1 addition & 1 deletion components/dataflow/tfdv/component.yaml
@@ -18,7 +18,7 @@ outputs:
- {name: Validation result, type: String, description: Indicates whether anomalies were detected or not.}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, /ml/validate.py]
args: [
--csv-data-for-inference, {inputValue: Inference data},
2 changes: 1 addition & 1 deletion components/dataflow/tfma/component.yaml
@@ -17,7 +17,7 @@ outputs:
- {name: Analysis results dir, type: GCPPath, description: GCS or local directory where the analysis results should be written.} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, /ml/model_analysis.py]
args: [
--model, {inputValue: Model},
2 changes: 1 addition & 1 deletion components/dataflow/tft/component.yaml
@@ -12,7 +12,7 @@ outputs:
- {name: Transformed data dir, type: GCPPath} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, /ml/transform.py]
args: [
--train, {inputValue: Training data file pattern},
2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/component.yaml
@@ -50,7 +50,7 @@ outputs:
type: GCSPath
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.bigquery, query,
--query, {inputValue: query},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/component.yaml
@@ -48,7 +48,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataflow, launch_python,
--python_file_path, {inputValue: python_file_path},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/component.yaml
@@ -58,7 +58,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataflow, launch_template,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/component.yaml
@@ -65,7 +65,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, create_cluster,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/delete_cluster/component.yaml
@@ -33,7 +33,7 @@ inputs:
type: Integer
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, delete_cluster,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hadoop_job/component.yaml
@@ -75,7 +75,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, submit_hadoop_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hive_job/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, submit_hive_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pig_job/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, submit_pig_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pyspark_job/component.yaml
@@ -64,7 +64,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, submit_pyspark_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_spark_job/component.yaml
@@ -71,7 +71,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, submit_spark_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_sparksql_job/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.dataproc, submit_sparksql_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/ml_engine/batch_predict/component.yaml
@@ -64,7 +64,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.ml_engine, batch_predict,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/ml_engine/deploy/component.yaml
@@ -79,7 +79,7 @@ outputs:
type: String
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.ml_engine, deploy,
--model_uri, {inputValue: model_uri},
2 changes: 1 addition & 1 deletion components/gcp/ml_engine/train/component.yaml
@@ -91,7 +91,7 @@ outputs:
type: GCSPath
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-gcp:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-gcp:a277f87ea1d4707bf860d080d06639b7caf9a1cf
args: [
kfp_component.google.ml_engine, train,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/kubeflow/deployer/component.yaml
@@ -10,7 +10,7 @@ inputs:
# - {name: Endpoint URI, type: Serving URI, description: 'URI of the deployed prediction service.'}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [/bin/deploy.sh]
args: [
--model-export-path, {inputValue: Model dir},
2 changes: 1 addition & 1 deletion components/kubeflow/dnntrainer/component.yaml
@@ -15,7 +15,7 @@ outputs:
- {name: Training output dir, type: GCPPath, description: 'GCS or local directory.'} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, -m, trainer.task]
args: [
--transformed-data-dir, {inputValue: Transformed data dir},
2 changes: 1 addition & 1 deletion components/kubeflow/launcher/kubeflow_tfjob_launcher_op.py
@@ -17,7 +17,7 @@
def kubeflow_tfjob_launcher_op(container_image, command, number_of_workers: int, number_of_parameter_servers: int, tfjob_timeout_minutes: int, output_dir=None, step_name='TFJob-launcher'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--workers', number_of_workers,
'--pss', number_of_parameter_servers,
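For reference, a hypothetical call to the launcher op defined above; the argument values are placeholders and are not taken from this PR:

```python
# Hypothetical usage inside a pipeline function; all values are placeholders.
train = kubeflow_tfjob_launcher_op(
    container_image='gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
    command=['python', '-m', 'trainer.task'],
    number_of_workers=2,
    number_of_parameter_servers=1,
    tfjob_timeout_minutes=60,
    output_dir='gs://my-bucket/tfjob-output')
```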
6 changes: 3 additions & 3 deletions components/kubeflow/launcher/src/train.template.yaml
@@ -26,7 +26,7 @@ spec:
spec:
containers:
- name: tensorflow
- image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command:
- python
- -m
@@ -49,7 +49,7 @@ spec:
spec:
containers:
- name: tensorflow
- image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command:
- python
- -m
@@ -72,7 +72,7 @@ spec:
spec:
containers:
- name: tensorflow
- image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command:
- python
- -m
2 changes: 1 addition & 1 deletion components/local/confusion_matrix/component.yaml
@@ -8,7 +8,7 @@ inputs:
# - {name: Metrics, type: Metrics}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, /ml/confusion_matrix.py]
args: [
--predictions, {inputValue: Predictions},
2 changes: 1 addition & 1 deletion components/local/roc/component.yaml
@@ -11,7 +11,7 @@ inputs:
# - {name: Metrics, type: Metrics}
implementation:
container:
- image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:2c2445df83fa879387a200747cc20f72a7ee9727
+ image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:a277f87ea1d4707bf860d080d06639b7caf9a1cf
command: [python2, /ml/confusion_matrix.py]
args: [
--predictions, {inputValue: Predictions dir},
2 changes: 1 addition & 1 deletion samples/kubeflow-tf/kubeflow-training-classification.py
@@ -66,7 +66,7 @@ def kubeflow_training(output, project,
).apply(gcp.use_gcp_secret('user-gcp-sa'))

if use_gpu:
- training.image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:2c2445df83fa879387a200747cc20f72a7ee9727',
+ training.image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
training.set_gpu_limit(1)

prediction = dataflow_tf_predict_op(
@@ -44,13 +44,13 @@
"EVAL_DATA = 'gs://ml-pipeline-playground/tfx/taxi-cab-classification/eval.csv'\n",
"HIDDEN_LAYER_SIZE = '1500'\n",
"STEPS = 3000\n",
- "DATAFLOW_TFDV_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
- "DATAFLOW_TFT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
- "DATAFLOW_TFMA_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
- "DATAFLOW_TF_PREDICT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
- "KUBEFLOW_TF_TRAINER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
- "KUBEFLOW_TF_TRAINER_GPU_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
- "KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:2c2445df83fa879387a200747cc20f72a7ee9727'\n",
+ "DATAFLOW_TFDV_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
+ "DATAFLOW_TFT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
+ "DATAFLOW_TFMA_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
+ "DATAFLOW_TF_PREDICT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
+ "KUBEFLOW_TF_TRAINER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
+ "KUBEFLOW_TF_TRAINER_GPU_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
+ "KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:a277f87ea1d4707bf860d080d06639b7caf9a1cf'\n",
"DEPLOYER_MODEL = 'notebook_tfx_taxi'\n",
"DEPLOYER_VERSION_DEV = 'dev'\n",
"DEPLOYER_VERSION_PROD = 'prod'\n",
6 changes: 3 additions & 3 deletions samples/resnet-cmle/resnet-train-pipeline.py
@@ -22,7 +22,7 @@ def resnet_preprocess_op(project_id: 'GcpProject', output: 'GcsUri', train_csv:
validation_csv: 'GcsUri[text/csv]', labels, step_name='preprocess'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/resnet-preprocess:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/resnet-preprocess:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--project_id', project_id,
'--output', output,
@@ -38,7 +38,7 @@ def resnet_train_op(data_dir, output: 'GcsUri', region: 'GcpRegion', depth: int,
num_eval_images: int, num_label_classes: int, tf_version, step_name='train'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/resnet-train:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/resnet-train:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--data_dir', data_dir,
'--output', output,
@@ -60,7 +60,7 @@ def resnet_deploy_op(model_dir, model, version, project_id: 'GcpProject', region
tf_version, step_name='deploy'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/resnet-deploy:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/resnet-deploy:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--model', model,
'--version', version,
16 changes: 8 additions & 8 deletions samples/tfx/taxi-cab-classification-pipeline.py
@@ -21,7 +21,7 @@
def dataflow_tf_data_validation_op(inference_data: 'GcsUri', validation_data: 'GcsUri', column_names: 'GcsUri[text/json]', key_columns, project: 'GcpProject', mode, validation_output: 'GcsUri[Directory]', step_name='validation'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--csv-data-for-inference', inference_data,
'--csv-data-to-validate', validation_data,
@@ -40,7 +40,7 @@ def dataflow_tf_data_validation_op(inference_data: 'GcsUri', validation_data: 'G
def dataflow_tf_transform_op(train_data: 'GcsUri', evaluation_data: 'GcsUri', schema: 'GcsUri[text/json]', project: 'GcpProject', preprocess_mode, preprocess_module: 'GcsUri[text/code/python]', transform_output: 'GcsUri[Directory]', step_name='preprocess'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--train', train_data,
'--eval', evaluation_data,
@@ -57,7 +57,7 @@ def dataflow_tf_transform_op(train_data: 'GcsUri', evaluation_data: 'GcsUri', sc
def tf_train_op(transformed_data_dir, schema: 'GcsUri[text/json]', learning_rate: float, hidden_layer_size: int, steps: int, target: str, preprocess_module: 'GcsUri[text/code/python]', training_output: 'GcsUri[Directory]', step_name='training'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--transformed-data-dir', transformed_data_dir,
'--schema', schema,
@@ -74,7 +74,7 @@ def tf_train_op(transformed_data_dir, schema: 'GcsUri[text/json]', learning_rate
def dataflow_tf_model_analyze_op(model: 'TensorFlow model', evaluation_data: 'GcsUri', schema: 'GcsUri[text/json]', project: 'GcpProject', analyze_mode, analyze_slice_column, analysis_output: 'GcsUri', step_name='analysis'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--model', model,
'--eval', evaluation_data,
@@ -91,7 +91,7 @@ def dataflow_tf_model_analyze_op(model: 'TensorFlow model', evaluation_data: 'Gc
def dataflow_tf_predict_op(evaluation_data: 'GcsUri', schema: 'GcsUri[text/json]', target: str, model: 'TensorFlow model', predict_mode, project: 'GcpProject', prediction_output: 'GcsUri', step_name='prediction'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--data', evaluation_data,
'--schema', schema,
@@ -108,7 +108,7 @@ def dataflow_tf_predict_op(evaluation_data: 'GcsUri', schema: 'GcsUri[text/json]
def confusion_matrix_op(predictions: 'GcsUri', output: 'GcsUri', step_name='confusion_matrix'):
return dsl.ContainerOp(
name=step_name,
- image='gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image='gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments=[
'--output', '%s/{{workflow.name}}/confusionmatrix' % output,
'--predictions', predictions,
@@ -119,7 +119,7 @@ def confusion_matrix_op(predictions: 'GcsUri', output: 'GcsUri', step_name='conf
def roc_op(predictions: 'GcsUri', output: 'GcsUri', step_name='roc'):
return dsl.ContainerOp(
name=step_name,
- image='gcr.io/ml-pipeline/ml-pipeline-local-roc:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image='gcr.io/ml-pipeline/ml-pipeline-local-roc:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments=[
'--output', '%s/{{workflow.name}}/roc' % output,
'--predictions', predictions,
@@ -130,7 +130,7 @@ def roc_op(predictions: 'GcsUri', output: 'GcsUri', step_name='roc'):
def kubeflow_deploy_op(model: 'TensorFlow model', tf_server_name, step_name='deploy'):
return dsl.ContainerOp(
name = step_name,
- image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:2c2445df83fa879387a200747cc20f72a7ee9727',
+ image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:a277f87ea1d4707bf860d080d06639b7caf9a1cf',
arguments = [
'--model-export-path', '%s/export/export' % model,
'--server-name', tf_server_name
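The ops defined above are wired together in the sample's pipeline function further down this file (outside the visible diff). A rough sketch of that wiring, with placeholder argument values and only two of the steps shown:

```python
import kfp.dsl as dsl

# Rough sketch only: argument values are placeholders and the real sample
# chains many more steps and parameters than shown here.
@dsl.pipeline(name='taxi-cab-classification-sketch')
def taxi_pipeline(output='gs://my-bucket/out', project='my-gcp-project'):
    validation = dataflow_tf_data_validation_op(
        'gs://taxi/train.csv', 'gs://taxi/eval.csv',
        'gs://taxi/column-names.json', 'trip_start_timestamp',
        project, 'local', '%s/{{workflow.name}}/validation' % output)
    preprocess = dataflow_tf_transform_op(
        'gs://taxi/train.csv', 'gs://taxi/eval.csv', 'gs://taxi/schema.json',
        project, 'local', 'gs://taxi/preprocessing.py',
        '%s/{{workflow.name}}/transformed' % output).after(validation)
```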