- patch hosted runner (#185). In this release, we have implemented a temporary fix to address issues with publishing artifacts in the release workflow. This fix involves changing the runner used for the job from `ubuntu-latest` to a protected runner group labeled "linux-ubuntu-latest". This ensures that the job runs on a designated hosted runner with the specified configuration, enhancing the reliability and security of the release process. The `permissions` section of the job remains unchanged, allowing authentication to PyPI and signing of release artifacts with sigstore-python. It is worth noting that this is a stopgap measure, and further changes to the release workflow may be made in the future.
- Fixed incorrect script for no-pylint-disable (#178). In this release, we have updated the script used in the `no-cheat` GitHub workflow to address false positives in stacked pull requests. The updated script fetches the base reference from the remote repository and generates a diff between the base reference and the current branch, saving it to a file. It then runs the `no_cheat.py` script against this diff file and saves the results to a separate file. If the count of cheats (instances where linting has been intentionally disabled) is greater than one, the script outputs the contents of the results file and exits with a non-zero status, indicating an error. This change enhances the accuracy of the script and ensures it functions correctly in a stacked pull request scenario. The `no_cheat` function, which checks for the presence of certain pylint disable tags in a given diff text, has been updated to the latest version from the ucx project to improve accuracy. The function identifies tags by looking for lines starting with `-` or `+` followed by the disable tag and a list of codes, and counts the number of times each code is added and removed, reporting any net additions.
- Skip dataclass fields only when `None` (#180). In this release, we have implemented a change that allows for the skipping of dataclass fields only when the value is `None`, enabling the inclusion of empty lists, strings, or zeros during marshalling. This modification is in response to issue #179 and involves adding a check for `None` before marshalling a dataclass field. Specifically, the previous condition `if not raw:` has been replaced with `if raw is None:` (see the sketch below). This change ensures that empty values such as `[]`, `''`, or `0` are not skipped during the serialization process, unless they are explicitly set to `None`. This enhancement provides improved compatibility and flexibility for users working with dataclasses containing empty values, allowing for more fine-grained control during the serialization process.
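A minimal sketch of the behavioral difference, using a simplified marshaller rather than the library's internal one:

```python
import dataclasses

@dataclasses.dataclass
class Config:
    name: str
    tags: list = dataclasses.field(default_factory=list)

def marshal(instance) -> dict:
    result = {}
    for field in dataclasses.fields(instance):
        raw = getattr(instance, field.name)
        if raw is None:  # previously `if not raw:`, which also dropped [] and ''
            continue
        result[field.name] = raw
    return result

print(marshal(Config("demo")))  # {'name': 'demo', 'tags': []} - empty list is kept
```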
Dependency updates:
- Bump codecov/codecov-action from 4 to 5 (#174).
- Fixed issue when Databricks SDK config objects were overridden for installation config files (#170). This commit addresses an issue where Databricks SDK config objects were being overridden during installation config file creation, which has been resolved by modifying the `_marshal` method in the `Installation` class to handle `databricks.sdk.core.Config` instances more carefully, and by introducing a new helper function `get_databricks_sdk_config` in the `paths.py` file, which retrieves the Databricks SDK configuration and improves the reliability and robustness of the SDK configuration. This fixes bug #169 and ensures that the SDK configuration is not accidentally modified during the installation process, preventing unexpected behavior and errors. The changes are isolated to the `paths.py` file and do not affect other parts of the codebase.
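The entry does not show the helper's code; purely as an illustration of the defensive pattern, here is a sketch under the assumption that a fresh `Config` is constructed rather than a shared instance being reused and mutated:

```python
from databricks.sdk.core import Config

def get_databricks_sdk_config(config_file: str | None = None) -> Config:
    # Building a new Config, rather than mutating a shared instance, keeps
    # installation-config marshalling from overriding the SDK configuration
    # used elsewhere in the process.
    if config_file is not None:
        return Config(config_file=config_file)
    return Config()  # resolves credentials from env vars / .databrickscfg
```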
- Bump actions/checkout from 4.2.1 to 4.2.2 (#160). In this release, the `actions/checkout` dependency has been updated from version 4.2.1 to 4.2.2. This update includes changes to the `url-helper.ts` file, which now utilizes well-known environment variables for improved reliability and maintainability. Additionally, unit test coverage for the `isGhes` function has been expanded. These changes are recommended for adoption to take advantage of the enhancements. The pull request includes a detailed changelog, commit history, and instructions for managing the update using Dependabot commands and options.
- Bump databrickslabs/sandbox from acceptance/v0.3.1 to 0.4.2 (#166). In the latest release, the `databrickslabs/sandbox` Python package has been updated from version acceptance/v0.3.1 to 0.4.2. This update includes new features such as installation instructions, additional go-git libraries, and modifications to the README file. Dependency updates include a bump in the version of `golang.org/x/crypto` used. The pull request for this update was created by a GitHub bot, Dependabot, which will manage any conflicts and respond to comments containing specific commands. It is essential to thoroughly review and test this updated version to ensure that the new methods and modifications to existing functionality do not introduce any issues or regressions, and that the changes are well-documented and justified.
- Don't draft automated releases (#159). In this release, the draft release feature in the GitHub Actions workflow has been disabled, enhancing the release process for software engineers. The `draft: true` parameter has been removed from the `Draft release` job, which means that automated releases will now be published immediately upon creation instead of being created as drafts. This modification simplifies and streamlines the release process, making it more efficient for engineers who adopt the project. The change is aimed at reducing the time and effort required in manually publishing draft releases, thereby improving the overall experience for project contributors and users.
- Updated custom `Path` support for python 3.13 (#161). In this revision, the project's continuous integration (CI) workflow has been updated to include Python 3.13, enhancing compatibility and enabling early identification of platform-specific issues. The `paths` module has been refactored into several submodules for better organization, and a new submodule, `databrickspath_posixpath`, has been added to distinguish `PosixPath` from `DBFSPath` and `WorkspacePath`. The comparison and equality behavior of `_DatabricksPath` objects has been modified to include `parser` property identity checks in Python 3.13, ensuring consistent behavior and eliminating exceptions when built-in paths are compared with custom paths. These updates promote confidence in the project's long-term viability and adaptability in response to evolving language standards.
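A minimal sketch of that kind of equality rule, assuming POSIX-style parsing for Databricks paths; the class name and fields are illustrative, not the project's internals:

```python
import posixpath
import sys

class _DatabricksPathSketch:
    parser = posixpath  # assumption: Databricks paths parse POSIX-style

    def __init__(self, path: str) -> None:
        self._str = path

    def __eq__(self, other: object) -> bool:
        if not isinstance(other, _DatabricksPathSketch):
            return NotImplemented
        if sys.version_info >= (3, 13) and self.parser is not other.parser:
            return False  # a different parser means a different path flavour
        return self._str == other._str

    def __hash__(self) -> int:
        return hash(self._str)
```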
Dependency updates:
- Bump actions/checkout from 4.2.1 to 4.2.2 (#160).
- Bump databrickslabs/sandbox from acceptance/v0.3.1 to 0.4.2 (#166).
- Bump actions/checkout from 4.1.7 to 4.2.0 (#149). In this pull request, the `actions/checkout` dependency is upgraded from version 4.1.7 to 4.2.0 in the `acceptance.yml` and `downstreams.yml` workflow files. The new version provides additional Ref and Commit outputs, as well as updated dependencies, which aim to improve the functionality and security of the checkout process. The `Ref` output is a string representing the reference that was checked out, and the `Commit` output is the SHA-1 hash of the checked-out commit. Dependency updates include bumping the `braces` package from 3.0.2 to 3.0.3 and updating the minor-npm-dependencies group across one directory with four updates. These changes contribute to a more reliable and efficient checkout process and enhance the overall functionality and maintainability of the Action. Software engineers are recommended to review the changes and ensure they do not introduce conflicts with their current setup before adopting the new version.
- Bump actions/checkout from 4.2.0 to 4.2.1 (#152). In this update, the version of the `actions/checkout` GitHub Action is bumped from 4.2.0 to 4.2.1 in the project's GitHub workflow files. This new version includes a modification to check out other `refs/*` by commit if provided, falling back to the ref. This change enhances the flexibility of the `checkout` action in handling different types of references, which could be useful for users working with multiple branches or references in their workflows. The update also adds a workflow file for publishing releases to an immutable action package. This release was contributed by the new project collaborator, @orhantoy, who made the change in pull request 1924.
- Bump databrickslabs/sandbox from acceptance/v0.3.0 to 0.3.1 (#155). In this update, the dependency for `databrickslabs/sandbox` has been bumped from version `acceptance/v0.3.0` to `0.3.1`. This change includes bug fixes, upgrades to go-git libraries, and dependency updates. The `golang.org/x/crypto` library was specifically bumped from version `0.16.0` to `0.17.0` in both `/go-libs` and `/runtime-packages`. Additionally, the `cac167b` commit expanded acceptance test logs and introduced experimental OIDC refresh token rotation. The acceptance test job in the workflow was also updated to use the new version of `databrickslabs/sandbox`, and ignore conditions were added for previous versions of `databrickslabs/sandbox` in this release. The README was also modified, and install instructions were added to the changelog.
- Catch all errors when checking Databricks path, notably BadRequest ones (#156). This commit improves the error handling of the `exists` method in the `paths.py` file when checking a Databricks path. Previously, only `NotFound` errors were caught, but now `BadRequest` errors are also handled, addressing issue #2882. The `exists` method has been updated to catch and manage `DatabricksError` exceptions, which now encompass `BadRequest` errors, ensuring comprehensive error handling for Databricks path-related operations (see the first sketch after this list). Additionally, the `_cached_file_info` and `_cached_object_info` attributes are now initialized when a `DatabricksError` exception occurs, returning `False` accordingly. This enhancement maintains consistency and accuracy in the `exists` method while broadening the range of errors captured, resulting in a more robust and reliable codebase with enhanced error reporting for users.
- Normalize databricks paths as part of resolving them (#157). In this release, the `resolve` method in the `paths.py` file of the databricks/labs/blueprint project has been enhanced to handle parent directory references ("..") consistently with Python's built-in `Path` object. Previously, `Path("/a/b/../c").resolve()` would return `Path("/a/c")`, while Databricks paths were not behaving consistently. This modification introduces a new `_normalize()` method, which processes the path parts and ensures that ".." segments are handled correctly (see the second sketch after this list). The commit also includes a new test function, `test_resolve_is_consistent`, which checks the consistent resolution of Databricks paths with various input formats, such as relative paths, ".." or "." components, and absolute paths. This change ensures that the resolved path is normalized according to the expected behavior, regardless of the input format, contributing to the resolution of issue #2882. By normalizing Databricks paths in the same fashion as Python's built-in `Path` object, the code becomes more robust and predictable for software engineers utilizing the project.
- Updated databrickslabs/sandbox requirement to acceptance/v0.3.0 (#153). In this pull request, the `databrickslabs/sandbox` package requirement in the downstreams GitHub Actions workflow is updated to version 0.3.0, the latest version available. This package provides a sandbox environment for development and testing, and the new version includes bug fixes and dependency updates that may enhance its reliability and performance. Dependabot has been used to ensure a smooth update process, with any conflicts being resolved automatically. However, it is recommended to review the changelog and test the updated version before merging this pull request to ensure compatibility and functionality in your specific use case. Additionally, Dependabot commands are available to manage ignore conditions for this dependency.
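First sketch: a minimal illustration of the broadened `exists` handling, assuming the SDK's error hierarchy; it is not the library's exact implementation:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import DatabricksError

def workspace_path_exists(ws: WorkspaceClient, path: str) -> bool:
    try:
        ws.workspace.get_status(path)
        return True
    except DatabricksError:
        # NotFound, BadRequest, and other service errors alike mean the path
        # cannot be treated as existing; cached metadata would be reset here.
        return False
```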
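Second sketch: ".." normalization over path parts, matching the behavior of Python's built-in `Path` resolution described above:

```python
def normalize(parts: tuple[str, ...]) -> tuple[str, ...]:
    stack: list[str] = []
    for part in parts:
        if part == "..":
            if stack:
                stack.pop()  # ".." consumes the previous segment
        elif part not in (".", ""):
            stack.append(part)
    return tuple(stack)

assert normalize(("a", "b", "..", "c")) == ("a", "c")
```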
Dependency updates:
- Bump actions/checkout from 4.1.7 to 4.2.0 (#149).
- Bump actions/checkout from 4.2.0 to 4.2.1 (#152).
- Updated databrickslabs/sandbox requirement to acceptance/v0.3.0 (#153).
- Bump databrickslabs/sandbox from acceptance/v0.3.0 to 0.3.1 (#155).
- Added Databricks CLI version as part of routed command telemetry (#147). A new environment variable, `DATABRICKS_CLI_VERSION`, has been introduced to carry the Databricks CLI version for routed command telemetry. This variable is incorporated into the user agent for outgoing requests via the `with_user_agent_extra` method, thereby enhancing detailed tracking and version identification in telemetry data. The `with_user_agent_extra` method is invoked twice, with the `blueprint` prefix and the version variable, followed by the `cli` prefix and the `DATABRICKS_CLI_VERSION` environment variable, ensuring that both the blueprint and CLI versions are transmitted in the user agent for all requests.
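A minimal sketch of the double invocation described, assuming the `with_user_agent_extra` helper from the Databricks SDK; the version literal is a placeholder:

```python
import os
from databricks.sdk.config import with_user_agent_extra

blueprint_version = "0.9.0"  # placeholder; resolved from package metadata in practice
with_user_agent_extra("blueprint", blueprint_version)

cli_version = os.environ.get("DATABRICKS_CLI_VERSION")
if cli_version:
    # Routed commands receive the CLI version through this environment variable.
    with_user_agent_extra("cli", cli_version)
```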
- add missing stat() methods to DBFSPath and WorkspacePath (#144). The `stat()` method has been added to both the `DBFSPath` and `WorkspacePath` classes, addressing issues #142 and #143. This method, which adheres to the Posix standard, returns file status in the `os.stat_result` format, providing access to various metadata attributes such as file size, last modification time, and creation time. By incorporating this method, developers can now obtain essential file information for Databricks File System (DBFS) and Databricks Workspace paths when working with these classes. The change includes a new test case for `stat()` in the `test_paths.py` file to ensure the correctness of the method for both classes.
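A minimal sketch of building an `os.stat_result` from service metadata; the field names (a byte size and a modification time in milliseconds) are assumptions about the underlying API:

```python
import os
import stat

def to_stat_result(size: int, modified_at_ms: int, is_dir: bool) -> os.stat_result:
    mode = stat.S_IFDIR if is_dir else stat.S_IFREG
    mtime = modified_at_ms // 1000  # assumed: the service reports milliseconds
    # os.stat_result takes a 10-tuple:
    # (mode, ino, dev, nlink, uid, gid, size, atime, mtime, ctime)
    return os.stat_result((mode, 0, 0, 1, 0, 0, size, mtime, mtime, mtime))

info = to_stat_result(size=1024, modified_at_ms=1_700_000_000_000, is_dir=False)
print(info.st_size, info.st_mtime)
```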
- Make hatch a prerequisite (#137). In version 1.9.4, hatch has become a prerequisite for installation in the GitHub workflow for the project's main branch, due to occasional failures in `pip install hatch` that depend on the local environment. This change, which includes defining the hatch version as an environment variable and adding a new step for installing hatch with a specific version, aims to enhance the reliability of the build and testing process by eliminating potential installation issues with hatch. Users should install hatch manually before executing the Makefile, as the line `pip install hatch` has been removed from the Makefile. This change aligns with the approach taken for ucx, and users are expected to understand the requirement to install prerequisites before executing the Makefile. To contribute to this project, please install hatch using `pip install hatch`, clone the GitHub repository, and run `make dev` to start the development environment and install necessary dependencies.
- support files with unicode BOM (#138). The recent change to the open-source library introduces support for handling files with a Unicode Byte Order Mark (BOM) during file upload and download operations in Databricks Workspace. This new functionality, added to the `WorkspacePath` class, allows for easier reading of text from files with the addition of a `read_text` method. When downloading a file, if it starts with a BOM, the BOM is detected and used for decoding, regardless of the preferred encoding based on the system's locale. The change includes a new test function that verifies the accurate encoding and decoding of files with different types of BOM using the appropriate encoding. Although Databricks notebooks with a BOM cannot be tested, because the Databricks platform modifies the uploaded data, this change enhances support for handling files with various encodings and BOM, improving compatibility with a broader range of file formats and ensuring more accurate handling of files with BOM.
- Fixed py3.10 compatibility for `_parts` in pathlike (#135). The recent update to our open-source library addresses a compatibility issue with Python 3.10 in the `_parts` property of a certain type. Prior to this change, there was also a `_cparts` property that returned the same value as `_parts`; it has been removed and replaced with a direct reference to `_parts`. The `_parts` property can now be accessed via reverse equality comparison, and this change has been implemented in the `joinpath` and `__truediv__` methods as well. This enhancement improves the library's compatibility with Python 3.10 and beyond, ensuring continued functionality and stability for software engineers working with the latest Python versions.
- Added `DBFSPath` as `os.PathLike` implementation (#131). The open-source library has been updated with a new class `DBFSPath`, an implementation of `os.PathLike` for Databricks File System (DBFS) paths. This new class extends the existing `WorkspacePath` support and provides pathlib-like functionality for DBFS paths, including methods for creating directories, renaming and deleting files and directories, and reading and writing files. The addition of `DBFSPath` includes type hinting for improved code linting and is integrated in the test suite with new and updated tests for path-like objects. The behavior of the `exists` and `unlink` methods has been updated for `WorkspacePath` to improve performance and raise appropriate errors.
- Fixed `.as_uri()` and `.absolute()` implementations for `WorkspacePath` (#127). In this release, the `WorkspacePath` class in the `paths.py` module has been updated with several improvements to the `.as_uri()` and `.absolute()` methods. These methods now utilize PathLib internals, providing better cross-version compatibility. The `.as_uri()` method now uses an f-string for concatenation, and the UTF-8 encoded string representation of the `WorkspacePath` object is available via a new `__bytes__()` dunder method. Additionally, the `.absolute()` method has been implemented for the trivial (no-op) case and now supports returning the absolute path of files or directories in Databricks Workspace. Furthermore, the `glob()` and `rglob()` methods have been enhanced to support case-sensitive pattern matching based on a new `case_sensitive` parameter. To ensure the integrity of these changes, two new test cases, `test_as_uri()` and `test_absolute()`, have been added, thoroughly testing the functionality of these methods.
- Fixed `WorkspacePath` support for python 3.11 (#121). The `WorkspacePath` class in our open-source library has been updated to improve compatibility with Python 3.11. The `.expanduser()` and `.glob()` methods have been modified to address internal changes in Python 3.11. The `is_dir()` and `is_file()` methods now include a `follow_symlinks` parameter, although it is not currently used. A new method, `_scandir()`, has been added for compatibility with Python 3.11. The `expanduser()` method has also been updated to expand `~` (but not `~user`) constructs; a sketch of this rule appears after this list. Additionally, a new method `is_notebook()` has been introduced to check if the path points to a notebook in Databricks Workspace. These changes ensure that the library functions smoothly with the latest version of Python and provides additional functionality for users working with Databricks Workspace.
- Properly verify versions of python (#118). In this release, we have made significant updates to the pyproject.toml file to enhance project dependency and development environment management. We have added several new packages to the `dependencies` section to expand the library's functionality and compatibility. Additionally, we have removed the `python` field, as it is no longer necessary. We have also updated the `path` field to specify the location of the virtual environment, which can improve integration with popular development tools such as Visual Studio Code and PyCharm. These changes are intended to streamline the development process and make it easier to manage dependencies and set up the development environment.
- Type annotations on path-related unit tests (#128). In this open-source library update, type annotations have been added to path-related unit tests to enhance code clarity and maintainability. The tests encompass various scenarios, including verifying if a path exists; creating, removing, and checking directories; and testing file attributes such as distinguishing directories, notebooks, and regular files. The additions also cover functionality for opening and manipulating files in different modes like read binary, write binary, read text, and write text. Furthermore, tests for checking file permissions, handling errors, and globbing (pattern-based file path matching) have been incorporated. The tests interact with a WorkspaceClient mock object, simulating file system interactions. This enhancement bolsters the library's reliability and assists developers in creating robust, well-documented code when working with file system paths.
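A minimal sketch of the `expanduser()` rule described above (`~` expands, `~user` does not); the home-folder lookup is an assumption:

```python
def expanduser(path: str, home: str = "/Users/someone@example.com") -> str:
    if path == "~" or path.startswith("~/"):
        return home + path[1:]
    if path.startswith("~"):
        # `~user` constructs are not supported for workspace paths
        raise RuntimeError(f"Could not determine home directory for: {path}")
    return path

assert expanduser("~/notes.txt") == "/Users/someone@example.com/notes.txt"
```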
- Updated `WorkspacePath` to support Python 3.12 (#122). In this release, the `WorkspacePath` implementation has been updated to ensure compatibility with Python 3.12, in addition to Python 3.10 and 3.11. The class was modified to replace most of the internal implementation and add extensive tests for public interfaces, ensuring that the superclass implementations are not used unless they are known to be safe. This change is in response to the significant changes in the superclass implementations between Python 3.11 and 3.12, which were found to be incompatible with each other. The `WorkspacePath` class now includes several new methods and tests to ensure that it functions seamlessly with different versions of Python. These changes include testing for initialization, equality, hash, comparison, path components, and various path manipulations. This update enhances the library's adaptability and ensures it functions correctly with different versions of Python. Classifiers have also been updated to include support for Python 3.12.
- `WorkspacePath` fixes for the `.resolve()` implementation (#129). The `.resolve()` method for `WorkspacePath` has been updated to improve its handling of relative paths and the `strict` argument. Previously, relative paths were not properly validated and would be returned as-is. Now, relative paths will cause the method to fail. The `strict` argument is now checked, and if set to `True` and the path does not exist, a `FileNotFoundError` will be raised. The method `.absolute()` is used to obtain the absolute path of the file or directory in Databricks Workspace and is used in the implementation of `.resolve()` (see the sketch after this list). A new test, `test_resolve()`, has been added to verify these changes, covering scenarios where the path is absolute, the path exists, the path does not exist, and the path is relative. In the case of relative paths, a `NotImplementedError` is raised, as `.resolve()` is not supported for them.
- `WorkspacePath`: Fix the .rename() and .replace() implementations to return the target path (#130). The `.rename()` and `.replace()` methods of the `WorkspacePath` class have been updated to return the target path as part of the public API, with `.rename()` no longer accepting the `overwrite` keyword argument and always failing if the target path already exists. A new private method, `._rename()`, has been added to include the `overwrite` argument and is used by both `.rename()` and `.replace()`. This update is a preparatory step for factoring out common code to support DBFS paths. The tests have been updated accordingly, combining and adding functions to test the new and updated methods. The `.unlink()` method's behavior remains unchanged. Please note that the exact error raised when `.rename()` fails due to an existing target path is yet to be defined.
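A minimal sketch of the documented `.resolve()` semantics; the function stands outside the class purely for illustration:

```python
def resolve(path, strict: bool = False):
    if not path.is_absolute():
        # Workspace paths have no working directory to resolve against.
        raise NotImplementedError(f"Relative path is not supported: {path}")
    absolute = path.absolute()
    if strict and not absolute.exists():
        raise FileNotFoundError(str(absolute))
    return absolute
```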
Dependency updates:
- Bump sigstore/gh-action-sigstore-python from 2.1.1 to 3.0.0 (#133).
- Added `databricks.labs.blueprint.paths.WorkspacePath` as `pathlib.Path` equivalent (#115). This commit introduces the `databricks.labs.blueprint.paths.WorkspacePath` library, providing Python-native `pathlib.Path`-like interfaces to simplify working with Databricks Workspace paths. The library includes `WorkspacePath` and `WorkspacePathDuringTest` classes offering advanced functionality for handling user home folders, relative file paths, browser URLs, and file manipulation methods such as `read/write_text()`, `read/write_bytes()`, and `glob()`. This addition brings enhanced, Pythonic ways to interact with Databricks Workspace paths, including creating and moving files, managing directories, and generating browser-accessible URIs (see the usage sketch after this list). Additionally, the commit includes updates to existing methods and introduces new fixtures for creating notebooks, accompanied by extensive unit tests to ensure reliability and functionality.
- Added propagation of `blueprint` version into `User-Agent` header when it is used as library (#114). A new feature propagates the `blueprint` version and the name of the command-line interface (CLI) command used in the `User-Agent` header when blueprint is utilized as a library. This feature includes the addition of two new pairs of `OtherInfo`: `blueprint/X.Y.Z` to indicate that the request is made using the `blueprint` library, and `cmd/<name>` to store the name of the CLI command used for making the request. The implementation uses the `with_user_agent_extra` function from `databricks.sdk.config` to set the user agent consistently with the Databricks CLI. Several changes have been made to `test_useragent.py` to include a new test case, `test_user_agent_is_propagated`, which checks if the `blueprint` version and the name of the command are correctly propagated to the `User-Agent` header. A context manager `http_fixture_server` has been added that creates an HTTP server with a custom handler, which extracts the `blueprint` version and the command name from the `User-Agent` header and stores them in the `user_agent` dictionary. The test case calls the `foo` command with a mocked `WorkspaceClient` instance, sets the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables, and then asserts that the `blueprint` version and the name of the command are present and correctly set in the `user_agent` dictionary.
- Bump actions/checkout from 4.1.6 to 4.1.7 (#112). In this release, the version of the `actions/checkout` action used in the `Checkout Code` step of the acceptance workflow has been updated from 4.1.6 to 4.1.7. This update may include bug fixes, performance improvements, and new features, although specific changes are not mentioned in the commit message. The `Unshallow` step remains unchanged, continuing to fetch and clean up the repository's history. This update ensures that the latest enhancements from the `actions/checkout` action are utilized, aiming to improve the reliability and performance of the code checkout process in the GitHub Actions workflow. Software engineers should be aware of this update and its potential impact on their workflows.
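A short usage sketch of the `pathlib`-style API, assuming a configured `WorkspaceClient`; the file path is hypothetical:

```python
from databricks.sdk import WorkspaceClient
from databricks.labs.blueprint.paths import WorkspacePath

ws = WorkspaceClient()
path = WorkspacePath(ws, "/Users/someone@example.com/notes.txt")
path.write_text("hello from pathlib-style APIs")
print(path.read_text())
for child in path.parent.glob("*.txt"):  # pattern matching, like pathlib
    print(child.name, child.as_uri())
```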
Dependency updates:
- Bump actions/checkout from 4.1.6 to 4.1.7 (#112).
- fixed `Command.get_argument_type` bug with `UnionType` (#110). In this release, the `Command.get_argument_type` method has been updated to include special handling for `UnionType`, resolving a bug that caused the function to crash when encountering this type. The method now returns the string representation of the annotation if the argument is a `UnionType`, providing more accurate and reliable results. To facilitate this, modifications were made using the `types` module. Additionally, the `foo` function has a new optional argument `optional_arg` of type `str`, with a default value of `None`. This argument is passed to the `some` function in the assertion. The `Prompts` type has been added to the `foo` function signature, and an assertion has been added to verify if `prompts` is an instance of `Prompts`. Lastly, the default value of the `address` argument has been changed from an empty string to "default", and the same changes have been applied to the `test_injects_prompts` test function.
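A minimal sketch of special-casing `types.UnionType` (e.g. `str | None`) when resolving an argument's type from a signature; the helper is a simplified stand-in, not the library's method:

```python
import inspect
import types

def get_argument_type(fn, argument_name: str):
    parameter = inspect.signature(fn).parameters.get(argument_name)
    if parameter is None:
        return None
    annotation = parameter.annotation
    if isinstance(annotation, types.UnionType):
        # Union annotations have no __name__; fall back to their string form.
        return str(annotation)
    return getattr(annotation, "__name__", str(annotation))

def foo(optional_arg: str | None = None): ...
print(get_argument_type(foo, "optional_arg"))  # "str | None"
```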
- Applied type casting & remove empty kwarg for Command (#108). A new method, `get_argument_type`, has been added to the `Command` class in the `cli.py` file to determine the type of a given argument name based on the function's signature. The `_route` method has been updated to remove any empty keyword arguments from the `kwargs` dictionary and to apply type casting based on the argument type using the `get_argument_type` method. This ensures that the `kwargs` passed into `App.command` are correctly typed and eliminates any empty keyword arguments, which were previously passed as empty strings. In the test file for the command-line interface, the `foo` command's keyword arguments have been updated to include `age` (int), `salary` (float), `is_customer` (bool), and `address` (str) types, with the `name` argument remaining and a default value for `address`. The `test_commands` and `test_injects_prompts` functions have been updated accordingly. These changes aim to improve the input validation and type safety of the `App.command` method.
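A minimal sketch of the routing behavior described: drop empty keyword arguments and cast the rest based on the command's signature. Names are illustrative, not the library's internals:

```python
import inspect

def route_kwargs(fn, kwargs: dict[str, str]) -> dict:
    sig = inspect.signature(fn)
    routed = {}
    for name, raw in kwargs.items():
        if raw == "":
            continue  # arguments passed as empty strings are dropped
        annotation = sig.parameters[name].annotation
        if annotation is bool:
            routed[name] = raw.lower() in ("true", "1", "yes")
        elif annotation in (int, float):
            routed[name] = annotation(raw)
        else:
            routed[name] = raw
    return routed

def grant_bonus(name: str, age: int, salary: float, is_customer: bool): ...
print(route_kwargs(grant_bonus, {"name": "bob", "age": "42",
                                 "salary": "1.5", "is_customer": "true"}))
```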
- Made `ProductInfo.version` a `cached_property` to avoid failure when comparing wheel uploads in development (#105). In this release, the `apply` method of a class has been updated to sort upgrade scripts in semantic versioning order before applying them, addressing potential issues with version comparison during development. The implementation of `ProductInfo.version` has been refactored to a `cached_property` called `_version`, which calculates and caches the project version, addressing a failure during wheel upload comparisons in development. The `Wheels` class constructor has also been updated to include explicit keyword-only arguments, and a deprecation warning has been added. These changes aim to improve the reliability and predictability of the upgrade process and the library as a whole.
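A minimal sketch of the `cached_property` pattern described; the version-lookup body is a placeholder assumption:

```python
from functools import cached_property

class ProductInfoSketch:
    @cached_property
    def _version(self) -> str:
        # Computed once, then cached: repeated reads during wheel-upload
        # comparisons always observe the same value.
        return self._guess_version()

    @property
    def version(self) -> str:
        return self._version

    def _guess_version(self) -> str:
        return "0.0.1+dev"  # hypothetical stand-in for real version detection
```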
Dependency updates:
- Bump actions/checkout from 4.1.5 to 4.1.6 (#106).
- Added upstream wheel uploads for Databricks Workspaces without Public Internet access (#99). This commit introduces a new feature for uploading upstream wheel dependencies to Databricks Workspaces without Public Internet access. A new flag has been added to upload functions, allowing users to include or exclude dependencies in the download list. The `WheelsV2` class has been updated with a new method, `upload_wheel_dependencies(prefixes)`, which checks if each wheel's name starts with any of the provided prefixes before uploading it to the Workspace File System (WSFS); see the sketch below. This feature also includes two new tests to verify the functionality of uploading the main wheel package and dependent wheel packages, optimizing downloads based on specific use cases. This enables users to more easily use the package in offline environments with restricted internet access, particularly for Databricks Workspaces with extra layers of network security.
- Fixed bug for double-uploading of unreleased wheels in air-gapped setups (#103). In this release, we have addressed a bug in the `upload_wheel_dependencies` method of the `WheelsV2` class, which caused double-uploading of unreleased wheels in air-gapped setups. This issue occurred because the condition `if wheel.name == self._local_wheel.name` was not met, resulting in undefined behavior. We have introduced a cached property `_current_version` to tackle this bug for unreleased versions uploaded to air-gapped workspaces. We also added a new method, `upload_to_wsfs()`, that uploads files to the workspace file system (WSFS) in the integration test. This release also includes new tests to ensure that only the Databricks SDK is uploaded and that the number of installation files is correct. These changes have resolved the double-uploading issue, and the number of installation files, Databricks SDK, Blueprint, and version.json metadata are now uploaded correctly to WSFS.
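A minimal sketch of the prefix filter and the single-upload guard described in the two entries above; wheel discovery and names are assumptions:

```python
from pathlib import Path

def select_wheel_dependencies(wheels: list[Path], prefixes: list[str],
                              local_wheel: Path) -> list[Path]:
    selected = []
    for wheel in wheels:
        if wheel.name == local_wheel.name:
            continue  # the main wheel is uploaded separately, never twice
        if any(wheel.name.startswith(prefix) for prefix in prefixes):
            selected.append(wheel)
    return selected
```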
- Added content assertion for `assert_file_uploaded` and `assert_file_dbfs_uploaded` in `MockInstallation` (#101). The recent commit introduces a content assertion feature to the `MockInstallation` class, enhancing its testing capabilities. This is achieved by adding an optional `expected` parameter of type `bytes` to the `assert_file_uploaded` and `assert_file_dbfs_uploaded` methods, allowing users to verify the uploaded content's correctness. The `_assert_upload` method has also been updated to accept this new parameter, ensuring the actual uploaded content matches the expected content. Furthermore, the commit includes informative docstrings for the new and updated methods, providing clear explanations of their functionality and usage. To support these improvements, new test cases `test_assert_file_uploaded` and `test_load_empty_data_class` have been added to the `tests/unit/test_installation.py` file, enabling more rigorous testing of the `MockInstallation` class and ensuring that the expected content is uploaded correctly.
- Added handling for partial functions in `parallel.Threads` (#93). In this release, we have enhanced the `parallel.Threads` module with the ability to handle partial functions, addressing issue #93. This improvement includes the addition of a new static method, `_get_result_function_signature`, to obtain the signature of a function, or a string representation of its arguments and keywords if it is a partial function (see the sketch after this list). The `_wrap_result` class method has also been updated to log an error message with the function's signature if an exception occurs. Furthermore, we have added a new test case, `test_odd_partial_failed`, to the unit tests, ensuring that the `gather` function handles partial functions that raise errors correctly. The Python version required for this project remains at 3.10, and the `pyproject.toml` file has been updated to include `isort`, `mypy`, `types-PyYAML`, and `types-requests` in the list of dependencies. These adjustments are aimed at improving the functionality and type checking in the `parallel.Threads` module.
- Align configurations with UCX project (#96). This commit brings project configurations in line with the UCX project through various fixes and updates, enhancing compatibility and streamlining collaboration. It addresses pylint configuration warnings, adjusts GitHub Actions workflows, and refines the `pyproject.toml` file. Additionally, the `NiceFormatter` class in `logger.py` has been improved for better code readability, and the versioning scheme has been updated to ensure SemVer and PEP 440 compliance, making it easier to manage and understand the project's versioning. Developers adopting the project will benefit from these alignments, as they promote adherence to the project's standards and up-to-date best practices.
- Check backwards compatibility with UCX, Remorph, and LSQL (#84). This release includes an update to the dependabot configuration to check for daily updates in both the pip and github-actions package ecosystems, with a new directory parameter added for the pip ecosystem for more precise update management. Additionally, a new GitHub Actions workflow, "downstreams", has been added to ensure backwards compatibility with UCX, Remorph, and LSQL by running automated downstream checks on pull requests, merge groups, and pushes to the main branch. The workflow has appropriate permissions for writing id-tokens, reading contents, and writing pull-requests, and runs the downstreams action from the databrickslabs/sandbox repository using GITHUB_TOKEN for authentication. These changes improve the security and maintainability of the project by ensuring compatibility with downstream projects and staying up-to-date with the latest package versions, reducing the risk of potential security vulnerabilities and bugs.
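A minimal sketch of producing a useful "signature" string for both plain callables and `functools.partial` objects, as described for `parallel.Threads`; the helper name is illustrative:

```python
import functools

def describe_callable(fn) -> str:
    if isinstance(fn, functools.partial):
        args = ", ".join(repr(a) for a in fn.args)
        kwargs = ", ".join(f"{k}={v!r}" for k, v in fn.keywords.items())
        bound = ", ".join(p for p in (args, kwargs) if p)
        return f"{fn.func.__name__}({bound})"
    return getattr(fn, "__qualname__", repr(fn))

print(describe_callable(functools.partial(print, "task", sep="|")))
# -> print('task', sep='|')
```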
Dependency updates:
- Bump actions/setup-python from 4 to 5 (#89).
- Bump softprops/action-gh-release from 1 to 2 (#87).
- Bump actions/checkout from 2.5.0 to 4.1.2 (#88).
- Bump codecov/codecov-action from 1 to 4 (#85).
- Bump actions/checkout from 4.1.2 to 4.1.3 (#95).
- Bump actions/checkout from 4.1.3 to 4.1.5 (#100).
- If `Threads.strict()` raises just one error, don't wrap it with `ManyError` (#79). The `strict` method, which wraps the `gather` function in the `parallel.py` module of the `databricks/labs/blueprint` package, has been updated to change the way it handles errors. Previously, if any task in the `tasks` sequence failed, the `strict` method would raise a `ManyError` exception containing all the errors. With this change, if only one error occurs, that error will be raised directly without being wrapped in a `ManyError` exception (see the sketch below). This simplifies error handling and avoids unnecessary nesting of exceptions. Additionally, the `__tracebackhide__` dunder variable has been added to the method to improve the readability of tracebacks by hiding the frame from the user. This update aims to provide a more streamlined and user-friendly experience for handling errors in parallel processing tasks.
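A minimal sketch of the unwrapping rule: a single failure is raised as-is, while multiple failures still raise the aggregate error. `ManyError` here is a stand-in for the library's class:

```python
class ManyError(Exception):  # stand-in for the library's aggregate error
    def __init__(self, errs: list[BaseException]):
        super().__init__(f"Detected {len(errs)} failures")
        self.errs = errs

def strict(results: list, errors: list[BaseException]) -> list:
    __tracebackhide__ = True  # keeps this frame out of pytest tracebacks
    if len(errors) == 1:
        raise errors[0]  # a single failure is raised directly, unwrapped
    if errors:
        raise ManyError(errors)
    return results
```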
- Fixed marshalling & unmarshalling edge cases (#76). The serialization and deserialization methods in the code have been updated to improve handling of edge cases during marshalling and unmarshalling of data. When encountering certain edge cases, the `_marshal_list` method will now return an empty list instead of `None`, and both the `_unmarshal` and `_unmarshal_dict` methods will return `None` as-is if the input is `None`. Additionally, the `_unmarshal` method has been updated to call `_unmarshal_generic` instead of checking if the type reference is a dictionary or list when it is a generic alias. The `_unmarshal_generic` method has also been updated to handle cases where the input is `None`. A new test case, `test_load_empty_data_class()`, has been added to the `tests/unit/test_installation.py` file to verify this behavior, ensuring that the correct behavior is maintained when encountering these edge cases during the marshalling and unmarshalling processes. These changes increase the reliability of the serialization and deserialization processes.
- Fixed edge cases when loading typing.Dict, typing.List and typing.ClassVar (#74). In this release, we have implemented changes to improve the handling of edge cases related to the Python `typing.Dict`, `typing.List`, and `typing.ClassVar` types during serialization and deserialization of dataclasses and generic types. Specifically, we have modified the `_marshal` and `_unmarshal` functions to check for the `__origin__` attribute to determine whether the type is a `ClassVar` and skip it if it is (a sketch of this check appears after the next entry). The `_marshal_dataclass` and `_unmarshal_dataclass` functions now check for the `__dataclass_fields__` attribute to ensure that only dataclass fields are marshalled and unmarshalled. We have also added a new unit test for loading a complex data class using the `MockInstallation` class, which contains various attributes such as a string, a nested dictionary, a list of `Policy` objects, and a dictionary mapping string keys to `Policy` objects. This test case checks that the installation object correctly serializes and deserializes the `ComplexClass` instance to and from JSON format according to the specified attribute types, including handling of the `typing.Dict`, `typing.List`, and `typing.ClassVar` types. These changes improve the reliability and robustness of our library in handling complex data types defined in the `typing` module.
- `MockPrompts.extend()` now returns a copy (#72). In the latest release, the `extend()` method in the `MockPrompts` class of the `tui.py` module has been enhanced. Previously, `extend()` would modify the original `MockPrompts` object, which could lead to issues when reusing the same object in multiple places within the same test, as its state would be altered each time `extend()` was called. This has been addressed by updating the `extend()` method to return a copy of the `MockPrompts` object with the updated patterns and answers, instead of modifying the original object. This change ensures that the original `MockPrompts` object can be safely reused in multiple test scenarios without unintended side effects, preserving the integrity of the original state. Furthermore, additional tests have been incorporated to verify the correct behavior of both the new and original prompts.
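A minimal sketch of the `ClassVar` check mentioned in the `typing` edge-cases entry above, using `typing.get_origin` rather than the library's internals; the dataclass is illustrative:

```python
import dataclasses
import typing

def marshal_dataclass(instance) -> dict:
    result = {}
    hints = typing.get_type_hints(type(instance))
    for name in getattr(type(instance), "__dataclass_fields__", {}):
        if typing.get_origin(hints[name]) is typing.ClassVar:
            continue  # class-level state is not part of the payload
        result[name] = getattr(instance, name)
    return result

@dataclasses.dataclass
class Policy:
    name: str
    kind: typing.ClassVar[str] = "policy"

print(marshal_dataclass(Policy("cluster")))  # {'name': 'cluster'}
```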
- Fixed `MockInstallation` to emulate workspace-global setup (#69). In this release, the `MockInstallation` class in the `installation` module has been updated to better replicate a workspace-global setup, enhancing testing and development accuracy. The `is_global` method now utilizes the `product` method instead of `_product`, and a new instance variable `_is_global` with a default value of `True` is introduced in the `__init__` method. Moreover, a new `product` method is included, which consistently returns the string "mock". These enhancements resolve issue #69, ensuring the `MockInstallation` instance behaves as a global installation, facilitating precise and reliable testing and development for our software engineering team.
- Improved `MockPrompts` with `extend()` method (#68). In this release, we've added an `extend()` method to the `MockPrompts` class in our library's TUI module. This new method allows developers to add new patterns and corresponding answers to the existing list of questions and answers in a `MockPrompts` object (see the usage sketch after this list). The added patterns are compiled as regular expressions and the questions and answers list is sorted by the length of the regular expression patterns in descending order. This feature is particularly useful for writing tests where prompt answers need to be changed, as it enables better control and customization of prompt responses during testing. By extending the list of questions and answers, you can handle additional prompts without modifying the existing ones, resulting in more organized and maintainable test code. If a prompt hasn't been mocked, attempting to ask a question with it will raise a `ValueError` with an appropriate error message.
- Use Hatch v1.9.4 as build machine requirement (#70). The Hatch package version for the build machine requirement has been updated from 1.7.0 to 1.9.4 in this change. This update streamlines the Hatch setup and version management, removing the specific installation step and listing `hatch` directly in the required field. The pre-setup command now only includes "hatch env create". Additionally, the acceptance tool version has been updated to ensure consistent project building and testing with the specified Hatch version. This change is implemented in the acceptance workflow file and the version of the acceptance tool used by the sandbox. This update ensures that the project can utilize the latest features and bug fixes available in Hatch 1.9.4, improving the reliability and efficiency of the build process. This change is part of the resolution of issue #70.
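A short usage sketch of `MockPrompts.extend()`, assuming it accepts the same pattern-to-answer mapping as the constructor; the prompt patterns are hypothetical:

```python
from databricks.labs.blueprint.tui import MockPrompts

base = MockPrompts({r"Do you want to proceed.*": "yes"})
# extend() returns a copy (#72), so `base` stays reusable across tests.
with_overrides = base.extend({r"Enter a name.*": "blueprint"})
assert with_overrides.confirm("Do you want to proceed?")
```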
- Added commands with interactive prompts (#66). This commit introduces a new feature in the Databricks Labs project to support interactive prompts in the command-line interface (CLI) for enhanced user interactivity. The `Prompts` argument, imported from `databricks.labs.blueprint.tui`, is now integrated into the `@app.command` decorator, enabling the creation of commands with user interaction like confirmation prompts. An example of this is the `me` command, which confirms whether the user wants to proceed before displaying the current username (see the sketch after this list). The commit also refactored the code to make it more efficient and maintainable, removing redundancy in creating client instances. The `AccountClient` and `WorkspaceClient` instances can now be provided automatically with the product name and version. These changes improve the CLI by making it more interactive, user-friendly, and adaptable to various use cases while also optimizing the codebase for better efficiency and maintainability.
- Added more code documentation (#64). This release introduces new features and updates to various files in the open-source library. The `cli.py` file in the `src/databricks/labs/blueprint` directory has been updated with a new decorator, `command`, which registers a function as a command. The `entrypoint.py` file in the `databricks.labs.blueprint` module now includes a module-level docstring describing its purpose, as well as documentation for the various standard libraries it imports. The `Installation` class in the `installers.py` file has new methods for handling files, such as `load`, `load_or_default`, `upload`, `load_local`, and `files`. The `installers.py` file also includes a new `InstallationState` dataclass, which is used to track installations. The `limiter.py` file now includes code documentation for the `RateLimiter` class and the `rate_limited` decorator, which are used to limit the rate of requests. The `logger.py` file includes a new `NiceFormatter` class, which provides a nicer format for logging messages with colors and bold text if the console supports it. The `parallel.py` file has been updated with new methods for running tasks in parallel and returning results and errors. The `tui.py` file has been documented, and includes imports for logging, regular expressions, and the collections abstract base class. Lastly, the `upgrades.py` file has been updated with additional code documentation and new methods for loading and applying upgrade scripts. Overall, these changes improve the functionality, maintainability, and usability of the open-source library.
- Fixed init-project command (#65). In this release, the `init-project` command has been improved with several bug fixes and new functionalities. A new import statement for the `sys` module has been added, and a `docs` directory is now included in the copied directories and files during initialization. The `init_project` function has been updated to open files using the default system encoding, ensuring proper reading and writing of file contents. The `relative_paths` function in the `entrypoint.py` file now returns absolute paths if the common path is the root directory, addressing issue #41. Additionally, several test functions have been added to `tests/unit/test_entrypoint.py`, enhancing the reliability and robustness of the `init-project` command by providing comprehensive tests for supporting functions. Overall, these changes significantly improve the functionality and reliability of the `init-project` command, ensuring a more consistent and accurate project initialization process.
- Using `ProductInfo` with integration tests (#63). In this update, the `ProductInfo` class has been enhanced with a new class method `for_testing(klass)` to facilitate effective integration testing. This method generates a new `ProductInfo` object with a random `product_name`, enabling the creation of distinct installation directories for each test execution. Prior to this change, conflicts and issues could arise when multiple test executions shared the same integration test folder. With the introduction of this new method, developers can now ensure that their integration tests run with unique product names and separate installation directories, enhancing testing isolation and accuracy. This update is demonstrated in the provided code snippet and includes a new test case to confirm the generation of unique product names. Furthermore, a pre-existing test case has been modified to provide a more specific error message related to the `SingleSourceVersionError`. This enhancement aims to improve the integration testing capabilities of the codebase and is designed to be easily adopted by other software engineers utilizing this project.
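A minimal sketch of a command with a confirmation prompt, modeled on the `me` example described above and assuming blueprint's `App`/`Prompts` wiring:

```python
from databricks.sdk import WorkspaceClient
from databricks.labs.blueprint.cli import App
from databricks.labs.blueprint.entrypoint import get_logger
from databricks.labs.blueprint.tui import Prompts

app = App(__file__)
logger = get_logger(__file__)

@app.command
def me(w: WorkspaceClient, prompts: Prompts):
    """Shows the current username after explicit confirmation."""
    if prompts.confirm("Are you sure?"):
        logger.info(f"Hello, {w.current_user.me().user_name}!")

if __name__ == "__main__":
    app()
```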
- Fixed the order of marshal to handle Dataclass with as_dict before other types to avoid SerdeError (#60). In this release, we have addressed an issue that caused a `SerdeError` during the `installation.save` operation with a dataclass object. The error was due to the order of evaluation in the `_marshal_dataclass` method. The order has been updated to evaluate the `as_dict` method first if it exists in the dataclass, which resolves the `SerdeError`. To ensure the correctness of the fix, we have added a new `test_data_class` function that tests the save and load functionality with a dataclass object. The test defines a `Policy` dataclass with an `as_dict` method that returns a dictionary representation of the object, and checks if the file is written correctly and if the loaded object matches the original object. This change has been thoroughly unit tested to ensure that it works as expected.
- Added automated upgrade framework (#50). This update introduces an automated upgrade framework for managing and applying upgrades to the product, with a new `upgrades.py` file that includes a `ProductInfo` class having methods for version handling, wheel building, and exception handling. The test code organization has been improved, and new test cases, functions, and a directory structure for fixtures and unit tests have been added for the upgrades functionality. The `test_wheels.py` file now checks the version of the Databricks SDK and handles cases where the version marker is missing or does not contain the `__version__` variable. Additionally, a new `Application State Migrations` section has been added to the README, explaining the process of seamless upgrades from version X to version Z through version Y, addressing the need for configuration or database state migrations as the application evolves. Users can apply these upgrades by following an idiomatic usage pattern involving several classes and functions. Furthermore, improvements have been made to the `_trim_leading_whitespace` function in the `commands.py` file of the `databricks.labs.blueprint` module, ensuring accurate and consistent removal of leading whitespace for each line in the command string, leading to better overall functionality and maintainability.
- Added brute-forcing `SerdeError` with `as_dict()` and `from_dict()` (#58). This commit introduces a brute-forcing approach for handling `SerdeError` using `as_dict()` and `from_dict()` methods in an open-source library. The new `SomePolicy` class demonstrates the usage of these methods for manual serialization and deserialization of custom classes (see the sketch after this list). The `as_dict()` method returns a dictionary representation of the class instance, and the `from_dict()` method, decorated with `@classmethod`, creates a new instance from the provided dictionary. Additionally, the GitHub Actions workflow for acceptance tests has been updated to include the `ready_for_review` event type, ensuring that tests run not only for opened and synchronized pull requests but also when marked as "ready for review." These changes provide developers with more control over the deserialization process and facilitate debugging in cases where default deserialization fails, but should be used judiciously to avoid brittle code.
- Fixed nightly integration tests run as service principals (#52). In this release, we have enhanced the compatibility of our codebase with service principals, particularly in the context of nightly integration tests. The `Installation` class in the `databricks.labs.blueprint.installation` module has been refactored, deprecating the `current` method and introducing two new methods: `assume_global` and `assume_user_home`. These methods enable users to install and manage `blueprint` as either a global or user-specific installation. Additionally, the `existing` method has been updated to work with the new `Installation` methods. In the test suite, the `test_installation.py` file has been updated to correctly detect global and user-specific installations when running as a service principal. These changes improve the testability and functionality of our software, ensuring seamless operation with service principals during nightly integration tests.
- Made `test_existing_installations_are_detected` more resilient (#51). In this release, we have added a new test function `test_existing_installations_are_detected` that checks if existing installations are correctly detected and retries the test for up to 15 seconds if they are not. This improves the reliability of the test by making it more resilient to potential intermittent failures. We have also added an import from `databricks.sdk.retries` named `retried`, which is used to retry the test function in case of an `AssertionError`. Additionally, the test function `test_existing` has been renamed to `test_existing_installations_are_detected` and the `xfail` marker has been removed. We have also renamed the test function `test_dataclass` to `test_loading_dataclass_from_installation` for better clarity. This change will help ensure that the library is correctly detecting existing installations and improve the overall quality of the codebase.
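A minimal sketch of the `as_dict()`/`from_dict()` escape hatch described for brute-forcing `SerdeError`; the field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class SomePolicy:
    a: int
    b: int

    def as_dict(self) -> dict:
        return {"a": self.a, "b": self.b}

    @classmethod
    def from_dict(cls, raw: dict) -> "SomePolicy":
        return cls(a=raw["a"], b=raw["b"])

policy = SomePolicy(1, 2)
assert SomePolicy.from_dict(policy.as_dict()) == policy
```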
- Automatically enable workspace filesystem if the feature is disabled (#42).
- Added special handling for notebooks in `Installation.upload(...)` (#36).
- Fixed issues with uploading wheels to DBFS and loading a non-existing install state (#34).
- Aligned `Installation` framework with UCX project (#32).
- Added common install state primitives with strong typing (#27).
- Added documentation for Invoking Databricks Connect (#28).
- Added more documentation for Databricks CLI command router (#30).
- Enforced `pylint` standards (#29).
- Changed python requirement from 3.10.6 to 3.10 (#25).
- Make `find_project_root` more deterministic (#23).
- Make it work with `ucx` (#21).
- Fixed sigstore action (#19).
- Sign artifacts with Sigstore (#17).