setting up dependency is not working between schema and volume #1977
Labels: DABs (DABs related issues)

Comments
Hi @hariaculeti, thanks for reaching out. You can set a dependency by using the following syntax (`${resources.schemas...}`) in your volume:
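```
resources:
  schemas:
    my_schema:
      name: dev_schema
      catalog_name: dev_catalog
  volumes:
    my_volume:
      name: dev_vol
      schema_name: ${resources.schemas.my_schema.name}
      catalog_name: dev_catalog
```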
Meanwhile, I'm working on automatically detecting this and showing a warning when we detect that a dependency is not set up.
Hi Shreyas,
This is working like a charm. I was not aware of setting a dependency like that. Thanks a lot for your help. That's all I need for now.
Regards,
Hari.
github-merge-queue bot pushed a commit that referenced this issue on Jan 16, 2025:
…1989)

## Changes

Fixes #1977. This PR modifies the bundle configuration to capture the dependency that a UC Volume or a DLT pipeline might have on a UC schema at deployment time. It does so by replacing the schema name with a reference of the form `${resources.schemas.foo.name}`.

For example, the following UC Volume definition depends on the UC schema named `schema_name`. This mutator converts the configuration from:

```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: schema_name
  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```

to:

```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: ${resources.schemas.foo.name}
  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```

## Tests

Unit tests and manually.
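To illustrate the rewrite the PR describes, here is a minimal, standalone sketch of the technique: scan the bundle's `resources` for volumes whose `schema_name` equals the `name` of a schema defined in the same bundle, and replace the literal value with a `${resources.schemas.<key>.name}` reference. This is only an illustrative Python sketch, not the CLI's actual Go mutator; the function name `capture_schema_dependencies` and the plain-dict representation of the configuration are assumptions made for the example.

```
# Illustrative sketch of the rewrite described above; not the CLI's actual Go
# implementation. Assumes the resources section is available as a plain dict.
def capture_schema_dependencies(resources: dict) -> dict:
    """Replace literal schema names in volumes with ${resources.schemas.<key>.name} references."""
    schemas = resources.get("schemas", {})
    # Map each schema's literal name to the key it is defined under in resources.schemas.
    name_to_key = {cfg["name"]: key for key, cfg in schemas.items() if "name" in cfg}

    for volume in resources.get("volumes", {}).values():
        schema_name = volume.get("schema_name")
        if schema_name in name_to_key:
            # Rewrite the literal name into a reference so the deployment order is captured.
            volume["schema_name"] = f"${{resources.schemas.{name_to_key[schema_name]}.name}}"
    return resources


if __name__ == "__main__":
    resources = {
        "schemas": {"foo": {"catalog_name": "catalog_name", "name": "schema_name"}},
        "volumes": {
            "bar": {
                "catalog_name": "catalog_name",
                "name": "volume_name",
                "schema_name": "schema_name",
            }
        },
    }
    capture_schema_dependencies(resources)
    print(resources["volumes"]["bar"]["schema_name"])
    # Prints: ${resources.schemas.foo.name}
```

According to the PR description, the real mutator applies the same substitution to DLT pipelines as well.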
Describe the issue
There is no way to set up a dependency between a schema and a volume. For example, if I create a schema and a volume in the same DAB and try to deploy, it fails on volume creation saying the schema doesn't exist. The only workaround is to deploy the DAB again; the second deployment works because the schema was created in the earlier deployment. But this is not an ideal solution.
Configuration
```
resources:
  schemas:
    my_schema:
      name: dev_schema
      catalog_name: dev_catalog
  volumes:
    my_volume:
      name: dev_vol
      schema_name: dev_schema
      catalog_name: dev_catalog
```
Steps to reproduce the behavior
1. Define a schema and a volume in the same bundle, with the volume's `schema_name` set to the schema's literal name (see the configuration above).
2. Run `databricks bundle deploy`.
3. The deployment fails when creating the volume because the schema does not exist yet.
Expected Behavior
Either the DAB should set the dependency internally on its own (like Terraform does), or users should get an option to set the dependency manually. Either way works; otherwise we have to do a lot more work just to get around the dependency.
Actual Behavior
"Error: cannot create volume: Schema 'dev_catalog.dev_schema' does not exist."
OS and CLI version
Windows and Databricks CLI v0.236.0
Is this a regression?
No. Volumes are a feature that was only added recently.
Debug Logs
Not needed, as the error is easily reproducible.