unable to use use_gcp_secret with value from PipelineParam #2089
I also get the same results with the last variant, debug_task = debug_op().apply(gcp.use_gcp_secret(str(secret))), with kfp SDK 0.1.29 and Kubeflow 0.6.2 (which I think has the pipeline components installed as 0.1.23, and Argo 2.3.0).
Hi Matt, here is a general-purpose sample, which I think you already know. May I know more about why you pass the gcp_secret name from a PipelineParam?
One more question: if you are targeting workload authorization isolation, would "Workload Identity" be a better fit? You bind the pipeline/component to a KSA, which is bound to a GSA.
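For readers unfamiliar with the KSA-to-GSA binding being suggested, the setup on GKE looks roughly like the following. All names (PROJECT_ID, GSA_NAME, NAMESPACE, KSA_NAME) are placeholders, and the exact steps depend on your cluster's Workload Identity configuration:

```shell
# Allow the Kubernetes service account (KSA) to impersonate the Google
# service account (GSA) via Workload Identity.
gcloud iam service-accounts add-iam-policy-binding \
  "GSA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --role roles/iam.workloadIdentityUser \
  --member "serviceAccount:PROJECT_ID.svc.id.goog[NAMESPACE/KSA_NAME]"

# Annotate the KSA so GKE knows which GSA it maps to.
kubectl annotate serviceaccount KSA_NAME -n NAMESPACE \
  iam.gke.io/gcp-service-account=GSA_NAME@PROJECT_ID.iam.gserviceaccount.com
```

With this in place, pods running under the KSA get the GSA's permissions without mounting a credentials secret at all, which sidesteps the parameterized-secret problem in this issue.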
Please try the latest SDK. This issue in Argo was inadvertently fixed by a later change. A small issue still remains.
The former - to run the same pipeline code with different configurations (secret, output paths, etc.).
I see. We're trying to make the use of secrets unnecessary as we move towards Workload Identity support, where the access rights are determined by the permissions of the service account used for the pipeline run.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.
What happened:
I am trying to write a pipeline where the secret name used with gcp.use_gcp_secret is a pipeline param defined at runtime. I've tried a few variants; here is the main pipeline code:
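The original snippet was lost from the thread; a hypothetical reconstruction of the intent (the pipeline name, component, and image below are made up, and this requires the kfp SDK of the ~0.1.x era) would be:

```python
# Hypothetical sketch, NOT the issue author's actual code: take the secret
# name as a pipeline parameter and feed it to gcp.use_gcp_secret.
import kfp.dsl as dsl
from kfp import gcp

@dsl.pipeline(name='debug-pipeline',
              description='Parameterize the GCP secret name at runtime.')
def debug_pipeline(secret='user-gcp-sa'):
    debug_op = dsl.ContainerOp(
        name='debug',
        image='google/cloud-sdk:alpine',
        command=['sh', '-c', 'gcloud auth list'],
    )
    # First variant: pass the pipeline parameter straight through.
    # At compile time `secret` is a dsl.PipelineParam, not a plain string,
    # which is what triggers the failures described below.
    debug_op.apply(gcp.use_gcp_secret(secret_name=secret))
```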
Since the compiler invokes the pipeline function with parameters that are kfp.dsl.PipelineParam instances, this first version fails at compile time.
If I change how the secret param name/value is referenced, this also fails at the compile stage, since the compiler invokes the function with PipelineParams whose value is None.
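The two compile-time failure modes can be made concrete with a self-contained sketch. The class below is a stub standing in for kfp.dsl.PipelineParam, not the real SDK, and the placeholder format in `__str__` is made up; the real SDK uses its own serialization:

```python
# Stub standing in for kfp.dsl.PipelineParam (illustration only).
class PipelineParam:
    def __init__(self, name, value=None):
        self.name = name
        # The compiler invokes the pipeline function before any run exists,
        # so the runtime value is not yet known.
        self.value = value

    def __str__(self):
        # The real SDK serializes to a compiler placeholder, never to the
        # runtime value; the exact format here is invented.
        return '{{pipelineparam:name=%s}}' % self.name


# What the compiler effectively does: call the pipeline function with
# PipelineParam instances instead of plain strings.
secret = PipelineParam('secret')

# Variant using str(secret): no exception, but the "secret name" baked into
# the workflow is a placeholder, not a real Kubernetes secret name.
assert str(secret) == '{{pipelineparam:name=secret}}'

# Variant reading secret.value: the value is None at compile time, so any
# code expecting a string (e.g. a secret name) breaks.
assert secret.value is None
```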
Finally, if I try the third variant, the pipeline compiles OK, but fails when it is run.
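The relevant part of a compiled workflow with a parameterized secret volume would look roughly like the fragment below. This is a hypothetical sketch, not the actual output attached to the issue; template names and mount paths are placeholders:

```yaml
# Hypothetical sketch of the relevant part of the compiled Argo workflow.
spec:
  templates:
  - name: debug
    inputs:
      parameters:
      - name: secret
    container:
      volumeMounts:
      - name: gcp-credentials
        mountPath: /secret/gcp-credentials
    volumes:
    - name: gcp-credentials
      secret:
        # The secret name is an Argo parameter reference rather than a
        # literal Kubernetes secret name.
        secretName: '{{inputs.parameters.secret}}'
```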
In the generated Argo workflow, the kfp Compiler seems to recognize that the secret name should be parameterized, but it seems as if Argo does not fully understand how to find a volume whose name is parameterized.
What did you expect to happen:
To be able to parameterize the name of the secret (containing GCP service account credentials) that a pipeline or component runs as.
Anything else you would like to add:
Tested with Kubeflow 0.5 and kfp python SDK 0.1.24.