
Document preparing external locations when creating catalogs #2915

Merged

merged 3 commits into `main` from `docs/explain-external-locations-for-catalogs` on Oct 10, 2024

Conversation

JCZuurmond (Member)

Changes

Document preparing external locations when creating catalogs.

Functionality

  • update README.md

JCZuurmond added the `documentation` label on Oct 10, 2024
JCZuurmond self-assigned this on Oct 10, 2024
JCZuurmond requested a review from a team as a code owner on October 10, 2024 at 11:08
Review thread on the new README section:

> [`create-catalogs-schemas` command](#create-catalogs-schemas-command). The command will create the UC catalogs and
> schemas based on the mapping file created in the previous step.
>
> This step requires considering how to [physically separate data in storage](https://docs.databricks.com/en/data-governance/unity-catalog/best-practices.html#data-is-physically-separated-in-storage)
JCZuurmond (Member, Author): @biswadeepupadhyay-db: Does this clarify what you were missing?

biswadeepupadhyay-db: OK, so as long as we are not going ahead with the default metastore path (used at the time of UC setup), it is recommended that we create an external location and a storage credential for the storage paths we aim to use for our catalogs.

JCZuurmond (Member, Author): The latter is generally advised by Databricks (according to the linked docs).

biswadeepupadhyay-db: Also, please confirm whether steps 2.1 through 2.3 are about finding the external locations of the Hive external tables from the assessment and creating the corresponding external locations and storage credentials for those paths.

JCZuurmond (Member, Author): That is correct.
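
To make this concrete, here is a minimal sketch of preparing an external location with the Databricks Python SDK before running `create-catalogs-schemas`. The location name, URL, and credential name are hypothetical, and the storage credential is assumed to exist already:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Hypothetical names and path; the storage credential "finance_cred" is
# assumed to have been created beforehand by a metastore admin.
w.external_locations.create(
    name="finance_landing",
    url="abfss://landing@example.dfs.core.windows.net/finance",
    credential_name="finance_cred",
    comment="External location backing the finance catalog",
)
```

With the location in place, `databricks labs ucx create-catalogs-schemas` can place the catalogs it creates on this storage instead of the default metastore path.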


✅ 29/29 passed, 51m6s total

Running from acceptance #6604

nfx (Collaborator) left a comment:

lgtm

nfx merged commit 3a5bce8 into `main` on Oct 10, 2024
7 checks passed
nfx deleted the `docs/explain-external-locations-for-catalogs` branch on October 10, 2024 at 11:29
nfx added a commit that referenced this pull request Oct 10, 2024
* Added `google-cloud-storage` to known list ([#2827](#2827)). In this release, we have added the `google-cloud-storage` library, along with its various modules and sub-modules, to our project's known list in a JSON file. Additionally, we have included the `google-crc32c` and `google-resumable-media` libraries. These libraries provide functionalities such as content addressable storage, checksum calculation, and resumable media upload and download. This change is a partial resolution to issue [#1931](#1931), which is likely related to the integration or usage of these libraries in the project. Software engineers should take note of these additions and how they may impact the project's functionality.
* Added `google-crc32c` to known list ([#2828](#2828)). With this commit, we have added the `google-crc32c` library to our system's known list, addressing part of issue [#1931](#1931). This addition enhances the overall functionality of the system by providing efficient and high-speed CRC32C computation when utilized. The `google-crc32c` library is known for its performance and reliability, and by incorporating it into our system, we aim to improve the efficiency and robustness of the CRC32C computation process. This enhancement is part of our ongoing efforts to optimize the system and ensure a more efficient experience for our end-users. With this change, users can expect faster and more reliable CRC32C computations in their applications.
* Added `holidays` to known list ([#2906](#2906)). In this release, we have expanded the known list in our open-source library to include a new `holidays` category, aimed at supporting tracking of holidays for different countries, religions, and financial institutions. This category includes several subcategories, such as calendars, countries, deprecation, financial holidays, groups, helpers, holiday base, mixins, observed holiday base, registry, and utils. Each subcategory contains an empty list, allowing for future data storage related to holidays. This change partially resolves issue [#1931](#1931), and represents a significant step towards supporting a more comprehensive range of holiday tracking needs in our library. Software engineers may utilize this new feature to build applications that require tracking and management of various holidays and related data.
* Added `htmlmin` to known list ([#2907](#2907)). In this update, we have added the `htmlmin` library to the `known.json` configuration file's list of known libraries. This addition enables the use and management of `htmlmin` and its components, including `htmlmin.command`, `htmlmin.decorator`, `htmlmin.escape`, `htmlmin.main`, `htmlmin.middleware`, `htmlmin.parser`, `htmlmin.python3html`, and `htmlmin.python3html.parser`. This change partially addresses issue [#1931](#1931), which may have been caused by the integration or usage of `htmlmin`. Software engineers can now utilize `htmlmin` and its features in their projects, thanks to this enhancement.
* Document preparing external locations when creating catalogs ([#2915](#2915)). Databricks Labs' UCX tool has been updated to incorporate the preparation of external locations when creating catalogs during the upgrade to Unity Catalog (UC). This enhancement involves the addition of new documentation outlining how to physically separate data in storage within UC, adhering to Databricks' best practices. The `create-catalogs-schemas` command has been updated to create UC catalogs and schemas based on a mapping file, allowing users to reuse previously created external locations or establish new ones outside of UCX. For data separation, users can leverage external locations when using subpaths, providing flexibility in data management during the upgrade process.
* Fixed `KeyError` from `assess_workflows` task ([#2919](#2919)). In this release, we have made significant improvements to error handling in our open-source library. We have fixed a `KeyError` in the `assess_workflows` task and modified the `_safe_infer_internal` and `_unsafe_infer_internal` methods to handle both `InferenceError` and `KeyError` during inference. When an error occurs, we now log the error message with the node and yield an `Uninferable` object. Additionally, we have updated the `do_infer_values` method of the `_LocalInferredValue` class to yield an iterator of iterables of `NodeNG` objects. We have added multiple unit tests for inferring values in Python code, including cases for handling externally defined values and their absence. These changes ensure that our library can handle errors more gracefully and provide more informative feedback during inference, making it more robust and easier to use in software engineering projects. A sketch of this error-handling pattern appears after this list.
* Fixed `OSError: [Errno 95]` bug in `assess_workflows` task by skipping GIT-sourced workflows from static code analysis ([#2924](#2924)). In this release, we have resolved the `OSError: [Errno 95]` bug in the `assess_workflows` task that occurred while performing static code analysis on GIT-sourced workflows. A new attribute `Source` has been introduced in the `jobs` module of the `databricks.sdk.service` package to identify the source of a notebook task. If the notebook task source is GIT, a new `DependencyProblem` is raised, indicating that notebooks in GIT should be analyzed using the `databricks labs ucx lint-local-code` CLI command. The `_register_notebook` method has been updated to check if the notebook task source is GIT and return an appropriate `DependencyProblem` message. This change enhances the reliability of the `assess_workflows` task by avoiding the aforementioned bug and provides a more informative message when notebooks are sourced from GIT. This change is part of our ongoing effort to improve the project's quality and reliability and benefits software engineers who adopt the project. A sketch of the GIT-source guard appears after this list.
* Fixed absolute path normalisation in source code analysis ([#2920](#2920)). In this release, we have addressed an issue with the Workspace API not supporting relative subpaths such as "/a/b/../c"; this is fixed by resolving workspace paths before calling the API. This fix is backward compatible and ensures the correct behavior of the source code analysis. Additionally, we have added integration tests and co-authored this commit with Eric Vergnaud and Serge Smertin. Furthermore, we have added a new test case that supports relative grand-parent paths in the dependency graph construction, utilizing a new `NotebookLoader` class. This loader is responsible for loading the notebook content and metadata given a path, and this new test case exercises the path resolution logic when a notebook depends on another notebook located two levels up in the directory hierarchy. These changes improve the robustness and reliability of the source code analysis in the presence of relative paths. The normalisation idea is sketched after this list.
* Fixed downloading wheel libraries from DBFS on mounted Azure Storage failing with access denied ([#2918](#2918)). In this release, we have introduced enhancements to the library's handling of registering and downloading wheel libraries from DBFS on mounted Azure Storage, addressing an issue that resulted in access denied errors. The changes include improved error handling with the addition of a `try-except` block to handle potential `BadRequest` exceptions and the inclusion of three new methods to register different types of libraries. The `_register_requirements_txt` method reads requirements files and registers each library specified in the file, logging a warning message for any references to other requirements or constraints files. The `_register_whl` method creates a temporary copy of the given wheel file in the local file system and registers it, while the `_register_egg` method checks the runtime version and yields a `DependencyProblem` if the version is greater than (14, 0). These changes simplify the code and enhance error handling while addressing the reported issues related to registering libraries. The changes are implemented in the `jobs.py` file located in the `databricks/labs/ucx/source_code` directory, which also includes the import of the `BadRequest` exception class from `databricks.sdk.errors`.
* Fixed issue with migrating MANAGED hive_metastore table to UC ([#2892](#2892)). In this release, we have implemented changes to address the issue of migrating HMS (Hive Metastore) managed tables to UC (Unity Catalog) as EXTERNAL. Historically, deleting a managed table also removed the underlying data, leading to potential data loss and making the UC table unusable. The new approach provides options to mitigate these issues, including migrating as EXTERNAL or cloning the data to maintain integrity. These changes aim to prevent accidental data deletion, ensure data recoverability, and avoid inconsistencies when new data is added to either HMS or UC. We have introduced new class attributes, methods, and parameters in relevant modules such as `WorkspaceConfig`, `Table`, `migrate_tables`, and `install.py`. These modifications support the new migration strategies and allow for more flexibility in managing how tables are migrated and how data is handled. The upgrade process can be triggered using the `migrate-tables` UCX command or by running the table migration workflows deployed to the workspace. Thorough testing and documentation have been performed to minimize risks of data inconsistencies during migration. It is crucial to understand the implications of these changes and carefully consider the trade-offs before migrating managed tables to UC as EXTERNAL.
* Improve creating UC catalogs ([#2898](#2898)). In this release, the process of creating Unity Catalog (UC) catalogs has been significantly improved with the resolution of several issues discovered during manual testing. The `databricks labs ucx create-ucx-catalog/create-catalogs-schemas` command has been updated to ensure a better user experience and enhance consistency. Changes include requesting the catalog location even if the catalog already exists, eliminating multiple loops over storage locations, and improving logging and matching storage locations. The code now includes new checks to avoid requesting a catalog's storage location if it already exists and updates the behavior of the `_create_catalog_validate` and `_validate_location` methods. Additionally, new unit tests have been added to verify these changes. Under the hood, a new method, `get_catalog`, has been introduced to the `WorkspaceClient` class, and several test functions, such as `test_create_ucx_catalog_skips_when_ucx_catalogs_exists` and `test_create_all_catalogs_schemas_creates_catalogs`, have been implemented to ensure the proper functioning of the updated command. This release addresses issue [#2879](#2879) and enhances the overall process of creating UC catalogs, making it more efficient and reliable. A sketch of the existence check appears after this list.
* Improve logging when skipping a grant in `create-catalogs-schemas` ([#2917](#2917)). In this release, the logging for skipping grants in the `_update_principal_acl` method of the `CatalogSchema` class has been improved. The code now logs a more detailed message when it cannot identify a UC grant for a specific grant object, indicating that the grant is a legacy grant that is not supported in UC, along with the grant's action type and associated object. This change provides more context for debugging and troubleshooting purposes. Additionally, the functionality of using a `DENY` grant instead of a `USAGE` grant for a specific principal and schema in the hive metastore has been introduced. The test case `test_catalog_schema_acl()` in the `test_catalog_schema.py` file has been updated to reflect this new behavior. A new test case `test_create_all_catalogs_schemas_logs_untranslatable_grant(caplog)` has also been added to verify the new logging behavior for skipping legacy grants that are not supported in UC. These changes improve the logging system and enhance the `CatalogSchema` class functionality in the open-source library.
* Verify migration progress prerequisites during UCX catalog creation ([#2912](#2912)). In this update, a new method `verify()` has been added to the `verify_progress_tracking` object in the `workspace_context` object to verify the prerequisites for UCX catalog creation. The prerequisites include the existence of a UC metastore, a UCX catalog, and a successful `assessment` job run. If the assessment job is pending or running, the code will wait up to 1 hour for it to finish before considering the prerequisites unmet. This feature includes modifications to the `create-ucx-catalog` CLI command and adds unit tests. This resolves issue [#2816](#2816) and ensures that the migration progress prerequisites are met before creating the UCX catalog. The `VerifyProgressTracking` class has been added to the `databricks.labs.ucx.progress.install` module and is used in the `Application` class. The changes include a new `timeout` argument to specify the waiting time for pending or running assessment jobs. The commit also includes several new unit tests for the `VerifyProgressTracking` class and modifications to the `test_install.py` file in the `tests/unit/progress` directory. The code has been manually tested and meets the requirements.
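
The error handling described in [#2919](#2919) above can be pictured as the following sketch; the method name follows the changelog entry, while the body is an assumption about how such a guard typically looks with `astroid`:

```python
import logging

from astroid import InferenceError, Uninferable
from astroid.nodes import NodeNG

logger = logging.getLogger(__name__)


def _safe_infer_internal(node: NodeNG):
    # Degrade gracefully when a value cannot be inferred: log the node and
    # yield Uninferable instead of crashing the assess_workflows task.
    try:
        yield from node.infer()
    except (InferenceError, KeyError) as error:
        logger.debug(f"Could not infer value of {node.as_string()}: {error}")
        yield Uninferable
```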
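
Similarly, the GIT-source guard from [#2924](#2924) reduces to a check along these lines; the `DependencyProblem` import path and constructor arguments are assumptions based on the entry above:

```python
from databricks.sdk.service import jobs

# Assumed import path for UCX's DependencyProblem.
from databricks.labs.ucx.source_code.graph import DependencyProblem


def check_notebook_source(task: jobs.Task):
    # Notebooks sourced from GIT cannot be fetched through the Workspace API,
    # so their analysis is deferred to `databricks labs ucx lint-local-code`.
    if task.notebook_task and task.notebook_task.source == jobs.Source.GIT:
        yield DependencyProblem(
            "not-supported",
            "Notebook sourced from GIT; analyze it with `databricks labs ucx lint-local-code`",
        )
```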
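
The path normalisation from [#2920](#2920) amounts to resolving paths client-side before calling the Workspace API; with only the standard library, the idea is:

```python
import posixpath

# The Workspace API does not support relative subpaths such as "/a/b/../c",
# so paths are normalised before the API call.
assert posixpath.normpath("/a/b/../c") == "/a/c"
```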
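
And the existence check from [#2898](#2898) can be sketched with the SDK as follows; the helper name is hypothetical:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import NotFound


def catalog_exists(w: WorkspaceClient, name: str) -> bool:
    # Only ask for a storage location when the catalog does not exist yet.
    try:
        w.catalogs.get(name)
        return True
    except NotFound:
        return False
```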
nfx mentioned this pull request on Oct 10, 2024
nfx added a commit that referenced this pull request Oct 10, 2024
Labels: documentation (Improvements or additions to documentation)

3 participants