forked from databrickslabs/ucx
Patch export v2 #16
Merged
Conversation
## Changes This PR makes some minor changes to the contributor documentation: - Hatch/env setup tweak to keep IntelliJ/PyCharm happy. (For some reason, if the full path isn't specified, IntelliJ can have problems locating the Python interpreter for the venv.) - ~Add in the linting step.~
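The venv-path tweak can be sketched as a Hatch environment setting in `pyproject.toml`. This is an assumption about what the change looks like, not the actual diff:

```toml
# Pin the default Hatch env to a fixed location so IDEs such as
# IntelliJ/PyCharm can reliably locate the interpreter for the venv.
[tool.hatch.envs.default]
path = ".venv"
```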
…ckslabs#2733) ## Changes Handle `PermissionDenied` when listing accessible workspaces ### Linked issues Resolves databrickslabs#2732 ### Functionality - [x] modified existing commands that use the accessible workspaces (most account-level commands) ### Tests - [ ] manually tested - [x] added unit tests
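The fix can be sketched as catching `PermissionDenied` while probing each workspace, instead of letting one inaccessible workspace abort the whole command. This is a minimal self-contained sketch with a stand-in exception and a duck-typed account client, not the actual ucx code:

```python
import logging

logger = logging.getLogger("ucx")


class PermissionDenied(Exception):
    """Stand-in for databricks.sdk.errors.PermissionDenied (assumption:
    the real command catches the SDK exception of the same name)."""


def accessible_workspaces(account_client) -> list:
    """List the account's workspaces, skipping (and logging) any the
    current user cannot access instead of failing the whole command."""
    result = []
    for workspace in account_client.workspaces.list():
        try:
            account_client.get_workspace_client(workspace)  # probe access
        except PermissionDenied:
            logger.warning(f"Cannot access workspace {workspace}, skipping")
            continue
        result.append(workspace)
    return result
```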
…ickslabs#2727) 🔧 This is still in progress... ## Changes ### Linked issues Resolves databrickslabs#1938 ### Functionality - [ ] added relevant user documentation - [x] added new CLI command: `unskip` ### Tests - [ ] manually tested - [ ] added unit tests - [ ] added integration tests
…ckslabs#2736) ## Changes Ensure 'assessment' workflow only runs minimal assessment in integration tests ### Linked issues None ### Functionality None ### Tests - [x] changed integration tests Co-authored-by: Eric Vergnaud <eric.vergnaud@databricks.com>
… commands (databrickslabs#2738) ## Changes Updated the doc to explain the concept of a collection, how to use the join-collection cmd, and the related collection-eligible commands ### Functionality - [ ] added relevant user documentation
… catalog/schema acl from legacy hive_metastore (databrickslabs#2676) ## Changes ### Linked issues Resolves databrickslabs#2514 ### Functionality - [ ] modified existing command: `databricks labs ucx create-catalog-schemas` ### Tests - [ ] added unit tests - [ ] added integration tests
## Changes Add `create-ucx-catalog` cli command to create the catalog (going to be) used for migration tracking (possibly) across multiple workspaces. ### Linked issues Resolves databrickslabs#2571 ### Functionality - [x] added relevant user documentation - [x] added new CLI command: `create-ucx-catalog` ### Tests - [x] manually tested - [x] added unit tests - [x] added integration tests --------- Co-authored-by: Andrew Snare <asnare@users.noreply.github.com>
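The command's core can be sketched as an idempotent create: check whether the catalog exists and only create it if not. The API shape below (a `get`/`create` pair and a `NotFound` stand-in) is an assumption for illustration, not the actual ucx implementation:

```python
import logging

logger = logging.getLogger("ucx")


class NotFound(Exception):
    """Stand-in for databricks.sdk.errors.NotFound (assumption)."""


def create_ucx_catalog(catalogs_api, name: str = "ucx") -> None:
    """Create the catalog used for migration tracking across workspaces,
    skipping creation when it already exists so the command is idempotent."""
    try:
        catalogs_api.get(name)
        logger.info(f"Catalog {name} already exists; skipping creation")
    except NotFound:
        catalogs_api.create(name)
        logger.info(f"Created catalog {name}")
```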
…slabs#2741) ## Changes Resolves databrickslabs#2529 ### Functionality - [ ] modified existing command: `databricks labs ucx migrate-locations` ### Tests - [ ] added unit tests - [ ] added integration tests --------- Co-authored-by: Amin Movahed <amin.movahed@databricks.com> Co-authored-by: Amin Movahed <140028681+aminmovahed-db@users.noreply.github.com>
…slabs#2696) ## Changes Add job/query problem widgets to the dashboard Add directfs access widget to the dashboard ### Linked issues Resolves databrickslabs#2595 ### Functionality None ### Tests - [x] added integration tests using mock data - [x] manually tested widgets, see below: https://github.com/user-attachments/assets/3684c30f-761a-4de6-bc67-de650c5d5353 --------- Co-authored-by: Eric Vergnaud <eric.vergnaud@databricks.com> Co-authored-by: Serge Smertin <259697+nfx@users.noreply.github.com>
## Changes Increases coverage ### Functionality - [x] fixes and modifies testing warnings ### Tests - [x] modified unit tests
…kslabs#2726) ## Changes The alias for the source table disappears when converting a CREATE VIEW statement from the legacy Hive metastore to a UC catalog. ### Linked issues Resolves databrickslabs#2661 ### Functionality - [x] fixes sql conversion ### Tests - [x] manually tested - [x] added unit tests - [ ] added integration tests - [ ] verified on staging environment (screenshot attached) --------- Co-authored-by: Liran Bareket <liran.bareket@databricks.com>
…databrickslabs#2746) ## Changes Our code works around a limitation of astroid < 3.3 where f-strings are not inferred This PR: - updates pylint and astroid - drops workarounds - fixes corresponding tests ### Linked issues None ### Functionality None ### Tests - [x] updated unit tests --------- Co-authored-by: Eric Vergnaud <eric.vergnaud@databricks.com>
…,<0.10 (databrickslabs#2747) Updates the requirements on [databricks-labs-blueprint](https://github.com/databrickslabs/blueprint) to permit the latest version. <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/databrickslabs/blueprint/blob/main/CHANGELOG.md">databricks-labs-blueprint's changelog</a>.</em></p> <blockquote> <h2>0.9.0</h2> <ul> <li>Added Databricks CLI version as part of routed command telemetry (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/147">#147</a>). A new environment variable, "DATABRICKS_CLI_VERSION", has been introduced in the Databricks CLI version for routed command telemetry. This variable is incorporated into the <code>with_user_agent_extra</code> method, which adds it to the user agent for outgoing requests, thereby enhancing detailed tracking and version identification in telemetry data. The <code>with_user_agent_extra</code> method is invoked twice, with the <code>blueprint</code> prefix and the <strong>version</strong> variable, followed by the <code>cli</code> prefix and the DATABRICKS_CLI_VERSION environment variable, ensuring that both the blueprint and CLI versions are transmitted in the user agent for all requests.</li> </ul> <h2>0.8.3</h2> <ul> <li>add missing stat() methods to DBFSPath and WorkspacePath (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/144">#144</a>). The <code>stat()</code> method has been added to both <code>DBFSPath</code> and <code>WorkspacePath</code> classes, addressing issues <a href="https://redirect.github.com/databrickslabs/blueprint/issues/142">#142</a> and <a href="https://redirect.github.com/databrickslabs/blueprint/issues/143">#143</a>. This method, which adheres to the Posix standard, returns file status in the <code>os.stat_result</code> format, providing access to various metadata attributes such as file size, last modification time, and creation time. 
By incorporating this method, developers can now obtain essential file information for Databricks File System (DBFS) and Databricks Workspace paths when working with these classes. The change includes a new test case for <code>stat()</code> in the <code>test_paths.py</code> file to ensure the correctness of the method for both classes.</li> </ul> <h2>0.8.2</h2> <ul> <li>Make hatch a prerequisite (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/137">#137</a>). In version 1.9.4, hatch has become a prerequisite for installation in the GitHub workflow for the project's main branch, due to occasional failures in <code>pip install hatch</code> that depend on the local environment. This change, which includes defining the hatch version as an environment variable and adding a new step for installing hatch with a specific version, aims to enhance the reliability of the build and testing process by eliminating potential installation issues with hatch. Users should install hatch manually before executing the Makefile, as the line <code>pip install hatch</code> has been removed from the Makefile. This change aligns with the approach taken for ucx, and users are expected to understand the requirement to install prerequisites before executing the Makefile. To contribute to this project, please install hatch using <code>pip install hatch</code>, clone the GitHub repository, and run <code>make dev</code> to start the development environment and install necessary dependencies.</li> <li>support files with unicode BOM (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/138">#138</a>). The recent change to the open-source library introduces support for handling files with a Unicode Byte Order Mark (BOM) during file upload and download operations in Databricks Workspace. This new functionality, added to the <code>WorkspacePath</code> class, allows for easier reading of text from files with the addition of a <code>read_text</code> method. 
When downloading a file, if it starts with a BOM, it will be detected and used for decoding, regardless of the preferred encoding based on the system's locale. The change includes a new test function that verifies the accurate encoding and decoding of files with different types of BOM using the appropriate encoding. Despite the inability to test Databrick notebooks with a BOM due to the Databricks platform modifying the uploaded data, this change enhances support for handling files with various encodings and BOM, improving compatibility with a broader range of file formats, and ensuring more accurate handling of files with BOM.</li> </ul> <h2>0.8.1</h2> <ul> <li>Fixed py3.10 compatibility for <code>_parts</code> in pathlike (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/135">#135</a>). The recent update to our open-source library addresses the compatibility issue with Python 3.10 in the <code>_parts</code> property of a certain type. Prior to this change, there was also a <code>_cparts</code> property that returned the same value as <code>_parts</code>, which has been removed and replaced with a direct reference to <code>_parts</code>. The <code>_parts</code> property can now be accessed via reverse equality comparison, and this change has been implemented in the <code>joinpath</code> and <code>__truediv__</code> methods as well. This enhancement improves the library's compatibility with Python 3.10 and beyond, ensuring continued functionality and stability for software engineers working with the latest Python versions.</li> </ul> <h2>0.8.0</h2> <ul> <li>Added <code>DBFSPath</code> as <code>os.PathLike</code> implementation (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/131">#131</a>). The open-source library has been updated with a new class <code>DBFSPath</code>, an implementation of <code>os.PathLike</code> for Databricks File System (DBFS) paths. 
This new class extends the existing <code>WorkspacePath</code> support and provides pathlib-like functionality for DBFS paths, including methods for creating directories, renaming and deleting files and directories, and reading and writing files. The addition of <code>DBFSPath</code> includes type-hinting for improved code linting and is integrated in the test suite with new and updated tests for path-like objects. The behavior of the <code>exists</code> and <code>unlink</code> methods have been updated for <code>WorkspacePath</code> to improve performance and raise appropriate errors.</li> <li>Fixed <code>.as_uri()</code> and <code>.absolute()</code> implementations for <code>WorkspacePath</code> (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/127">#127</a>). In this release, the <code>WorkspacePath</code> class in the <code>paths.py</code> module has been updated with several improvements to the <code>.as_uri()</code> and <code>.absolute()</code> methods. These methods now utilize PathLib internals, providing better cross-version compatibility. The <code>.as_uri()</code> method now uses an f-string for concatenation and returns the UTF-8 encoded string representation of the <code>WorkspacePath</code> object via a new <code>__bytes__()</code> dunder method. Additionally, the <code>.absolute()</code> method has been implemented for the trivial (no-op) case and now supports returning the absolute path of files or directories in Databricks Workspace. Furthermore, the <code>glob()</code> and <code>rglob()</code> methods have been enhanced to support case-sensitive pattern matching based on a new <code>case_sensitive</code> parameter. 
To ensure the integrity of these changes, two new test cases, <code>test_as_uri()</code> and <code>test_absolute()</code>, have been added, thoroughly testing the functionality of these methods.</li> <li>Fixed <code>WorkspacePath</code> support for python 3.11 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/121">#121</a>). The <code>WorkspacePath</code> class in our open-source library has been updated to improve compatibility with Python 3.11. The <code>.expanduser()</code> and <code>.glob()</code> methods have been modified to address internal changes in Python 3.11. The <code>is_dir()</code> and <code>is_file()</code> methods now include a <code>follow_symlinks</code> parameter, although it is not currently used. A new method, <code>_scandir()</code>, has been added for compatibility with Python 3.11. The <code>expanduser()</code> method has also been updated to expand <code>~</code> (but not <code>~user</code>) constructs. Additionally, a new method <code>is_notebook()</code> has been introduced to check if the path points to a notebook in Databricks Workspace. These changes aim to ensure that the library functions smoothly with the latest version of Python and provides additional functionality for users working with Databricks Workspace.</li> <li>Properly verify versions of python (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/118">#118</a>). In this release, we have made significant updates to the pyproject.toml file to enhance project dependency and development environment management. We have added several new packages to the <code>dependencies</code> section to expand the library's functionality and compatibility. Additionally, we have removed the <code>python</code> field, as it is no longer necessary. We have also updated the <code>path</code> field to specify the location of the virtual environment, which can improve integration with popular development tools such as Visual Studio Code and PyCharm. 
These changes are intended to streamline the development process and make it easier to manage dependencies and set up the development environment.</li> <li>Type annotations on path-related unit tests (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/128">#128</a>). In this open-source library update, type annotations have been added to path-related unit tests to enhance code clarity and maintainability. The tests encompass various scenarios, including verifying if a path exists, creating, removing, and checking directories, and testing file attributes such as distinguishing directories, notebooks, and regular files. The additions also cover functionality for opening and manipulating files in different modes like read binary, write binary, read text, and write text. Furthermore, tests for checking file permissions, handling errors, and globbing (pattern-based file path matching) have been incorporated. The tests interact with a WorkspaceClient mock object, simulating file system interactions. This enhancement bolsters the library's reliability and assists developers in creating robust, well-documented code when working with file system paths.</li> <li>Updated <code>WorkspacePath</code> to support Python 3.12 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/122">#122</a>). In this release, the <code>WorkspacePath</code> implementation has been updated to ensure compatibility with Python 3.12, in addition to Python 3.10 and 3.11. The class was modified to replace most of the internal implementation and add extensive tests for public interfaces, ensuring that the superclass implementations are not used unless they are known to be safe. This change is in response to the significant changes in the superclass implementations between Python 3.11 and 3.12, which were found to be incompatible with each other. 
The <code>WorkspacePath</code> class now includes several new methods and tests to ensure that it functions seamlessly with different versions of Python. These changes include testing for initialization, equality, hash, comparison, path components, and various path manipulations. This update enhances the library's adaptability and ensures it functions correctly with different versions of Python. Classifiers have also been updated to include support for Python 3.12.</li> <li><code>WorkspacePath</code> fixes for the <code>.resolve()</code> implementation (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/129">#129</a>). The <code>.resolve()</code> method for <code>WorkspacePath</code> has been updated to improve its handling of relative paths and the <code>strict</code> argument. Previously, relative paths were not properly validated and would be returned as-is. Now, relative paths will cause the method to fail. The <code>strict</code> argument is now checked, and if set to <code>True</code> and the path does not exist, a <code>FileNotFoundError</code> will be raised. The method <code>.absolute()</code> is used to obtain the absolute path of the file or directory in Databricks Workspace and is used in the implementation of <code>.resolve()</code>. A new test, <code>test_resolve()</code>, has been added to verify these changes, covering scenarios where the path is absolute, the path exists, the path does not exist, and the path is relative. In the case of relative paths, a <code>NotImplementedError</code> is raised, as <code>.resolve()</code> is not supported for them.</li> <li><code>WorkspacePath</code>: Fix the .rename() and .replace() implementations to return the target path (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/130">#130</a>). 
The <code>.rename()</code> and <code>.replace()</code> methods of the <code>WorkspacePath</code> class have been updated to return the target path as part of the public API, with <code>.rename()</code> no longer accepting the <code>overwrite</code> keyword argument and always failing if the target path already exists. A new private method, <code>._rename()</code>, has been added to include the <code>overwrite</code> argument and is used by both <code>.rename()</code> and <code>.replace()</code>. This update is a preparatory step for factoring out common code to support DBFS paths. The tests have been updated accordingly, combining and adding functions to test the new and updated methods. The <code>.unlink()</code> method's behavior remains unchanged. Please note that the exact error raised when <code>.rename()</code> fails due to an existing target path is yet to be defined.</li> </ul> <p>Dependency updates:</p> <ul> <li>Bump sigstore/gh-action-sigstore-python from 2.1.1 to 3.0.0 (<a href="https://redirect.github.com/databrickslabs/blueprint/pull/133">#133</a>).</li> </ul> <h2>0.7.0</h2> <ul> <li>Added <code>databricks.labs.blueprint.paths.WorkspacePath</code> as <code>pathlib.Path</code> equivalent (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/115">#115</a>). This commit introduces the <code>databricks.labs.blueprint.paths.WorkspacePath</code> library, providing Python-native <code>pathlib.Path</code>-like interfaces to simplify working with Databricks Workspace paths. The library includes <code>WorkspacePath</code> and <code>WorkspacePathDuringTest</code> classes offering advanced functionality for handling user home folders, relative file paths, browser URLs, and file manipulation methods such as <code>read/write_text()</code>, <code>read/write_bytes()</code>, and <code>glob()</code>. 
This addition brings enhanced, Pythonic ways to interact with Databricks Workspace paths, including creating and moving files, managing directories, and generating browser-accessible URIs. Additionally, the commit includes updates to existing methods and introduces new fixtures for creating notebooks, accompanied by extensive unit tests to ensure reliability and functionality.</li> <li>Added propagation of <code>blueprint</code> version into <code>User-Agent</code> header when it is used as library (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/114">#114</a>). A new feature has been introduced in the library that allows for the propagation of the <code>blueprint</code> version and the name of the command line interface (CLI) command used in the <code>User-Agent</code> header when the library is utilized as a library. This feature includes the addition of two new pairs of <code>OtherInfo</code>: <code>blueprint/X.Y.Z</code> to indicate that the request is made using the <code>blueprint</code> library and <code>cmd/<name></code> to store the name of the CLI command used for making the request. The implementation involves using the <code>with_user_agent_extra</code> function from <code>databricks.sdk.config</code> to set the user agent consistently with the Databricks CLI. Several changes have been made to the test file for <code>test_useragent.py</code> to include a new test case, <code>test_user_agent_is_propagated</code>, which checks if the <code>blueprint</code> version and the name of the command are correctly propagated to the <code>User-Agent</code> header. A context manager <code>http_fixture_server</code> has been added that creates an HTTP server with a custom handler, which extracts the <code>blueprint</code> version and the command name from the <code>User-Agent</code> header and stores them in the <code>user_agent</code> dictionary. 
The test case calls the <code>foo</code> command with a mocked <code>WorkspaceClient</code> instance and sets the <code>DATABRICKS_HOST</code> and <code>DATABRICKS_TOKEN</code> environment variables to test the propagation of the <code>blueprint</code> version and the command name in the <code>User-Agent</code> header. The test case then asserts that the <code>blueprint</code> version and the name of the command are present and correctly set in the <code>user_agent</code> dictionary.</li> <li>Bump actions/checkout from 4.1.6 to 4.1.7 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/112">#112</a>). In this release, the version of the "actions/checkout" action used in the <code>Checkout Code</code> step of the acceptance workflow has been updated from 4.1.6 to 4.1.7. This update may include bug fixes, performance improvements, and new features, although specific changes are not mentioned in the commit message. The <code>Unshallow</code> step remains unchanged, continuing to fetch and clean up the repository's history. This update ensures that the latest enhancements from the "actions/checkout" action are utilized, aiming to improve the reliability and performance of the code checkout process in the GitHub Actions workflow. Software engineers should be aware of this update and its potential impact on their workflows.</li> </ul> <p>Dependency updates:</p> <ul> <li>Bump actions/checkout from 4.1.6 to 4.1.7 (<a href="https://redirect.github.com/databrickslabs/blueprint/pull/112">#112</a>).</li> </ul> <h2>0.6.3</h2> <ul> <li>fixed <code>Command.get_argument_type</code> bug with <code>UnionType</code> (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/110">#110</a>). In this release, the <code>Command.get_argument_type</code> method has been updated to include special handling for <code>UnionType</code>, resolving a bug that caused the function to crash when encountering this type. 
The method now returns the string representation of the annotation if the argument is a <code>UnionType</code>, providing more accurate and reliable results. To facilitate this, modifications were made using the <code>types</code> module. Additionally, the <code>foo</code> function has a new optional argument <code>optional_arg</code> of type <code>str</code>, with a default value of <code>None</code>. This argument is passed to the <code>some</code> function in the assertion. The <code>Prompts</code> type has been added to the <code>foo</code> function signature, and an assertion has been added to verify if <code>prompts</code> is an instance of <code>Prompts</code>. Lastly, the default value of the <code>address</code> argument has been changed from an empty string to "default", and the same changes have been applied to the <code>test_injects_prompts</code> test function.</li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... (truncated)</p> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/databrickslabs/blueprint/commit/c3b53a48471b7c3d9ff911d7c6cc3921d6dd9846"><code>c3b53a4</code></a> Release v0.9.0 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/148">#148</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/98c5f305e721b7f9b2db88db7ad062481e4191dd"><code>98c5f30</code></a> Added Databricks CLI version as part of routed command telemetry (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/147">#147</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/2bfbf1801c1f8638dfadd5c072daeb4cbb9fa372"><code>2bfbf18</code></a> Release v0.8.3 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/145">#145</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/36fc873c0f9795e293d4716916fb96ed31680240"><code>36fc873</code></a> add missing stat() methods to DBFSPath and WorkspacePath (<a 
href="https://redirect.github.com/databrickslabs/blueprint/issues/144">#144</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/c531c3f5627d15057d0ae6e570140dedbed968ef"><code>c531c3f</code></a> Release v0.8.2 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/139">#139</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/53b94634639e673eeac880b005e9e64981259035"><code>53b9463</code></a> support files with unicode BOM (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/138">#138</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/ec8232664d4f45e2233404c7ab414d7c3393db1e"><code>ec82326</code></a> Make hatch a prerequisite (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/137">#137</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/98e75bcffd72dd3075a772947e0d06042ba81f6a"><code>98e75bc</code></a> Release v0.8.1 (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/136">#136</a>)</li> <li><a href="https://github.com/databrickslabs/blueprint/commit/821bc0adb4438e211586af91d3ea011bf97115cf"><code>821bc0a</code></a> Fixed py3.10 compatibility for <code>_parts</code> in pathlike (<a href="https://redirect.github.com/databrickslabs/blueprint/issues/135">#135</a>)</li> <li>See full diff in <a href="https://github.com/databrickslabs/blueprint/compare/v0.8.0...v0.9.0">compare view</a></li> </ul> </details> <br /> Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. 
[//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) </details> Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
## Changes solacc.py currently lints the entire solacc repo, thus accumulating temporary files to a point that exceeds CI storage capacity. This PR fixes the issue by: - linting the repo per top-level solacc 'solution' (within the repo, top-level folders are independent of each other) - deleting temp files and dirs registered in PathLookup after linting a solution This PR also prepares for improving false positive detection ### Linked issues None ### Functionality None ### Tests - [x] manually tested --------- Co-authored-by: Eric Vergnaud <eric.vergnaud@databricks.com>
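The cleanup strategy can be sketched as: lint one top-level folder at a time, and remove that solution's temporary directory before moving on, so disk usage stays bounded by the largest single solution rather than growing with the whole repo. The names below (`lint_solutions`, `lint_one`) are hypothetical, not the actual solacc.py API:

```python
import shutil
import tempfile
from pathlib import Path


def lint_solutions(repo_root: Path, lint_one) -> None:
    """Lint each top-level 'solution' folder independently, deleting the
    temp directory used for that solution before starting the next one,
    so temp files never accumulate across the whole repo."""
    for solution in sorted(p for p in repo_root.iterdir() if p.is_dir()):
        tmp = Path(tempfile.mkdtemp(prefix=f"lint-{solution.name}-"))
        try:
            lint_one(solution, tmp)  # may write scratch files into tmp
        finally:
            shutil.rmtree(tmp, ignore_errors=True)  # reclaim space per solution
```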
## Changes This PR just includes changes from `make fmt`, which doesn't currently pass on `main`. ### Linked issues Updates databrickslabs#2746.
…n parallel (databrickslabs#2745) We were not doing that before and now we do.
## Changes Harden configuration reading by verifying the type before reading the "value" using `.get` ### Linked issues Resolves databrickslabs#2581 (hopefully the second get is the issue, type hinting should cover that, but who knows) ### Functionality - [x] modified existing workflow: `assessment` ### Tests - [x] added unit tests
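The hardening pattern can be sketched as: verify the value's type at each level before the next `.get`, so configuration of an unexpected shape degrades to `None` instead of raising. The key names below are hypothetical, for illustration only:

```python
from typing import Optional


def read_spark_version(raw_config: dict) -> Optional[str]:
    """Read a nested config value defensively: verify the intermediate
    value is a dict before calling .get on it, rather than assuming the
    configuration always has the expected shape."""
    spec = raw_config.get("spec")
    if not isinstance(spec, dict):  # guard before the second .get
        return None
    version = spec.get("spark_version")
    return version if isinstance(version, str) else None
```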
…slabs#2734) ## Changes Add `unskip` CLI command to undo a skip on a schema or a table ### Linked issues Resolves databrickslabs#1938 ### Functionality - [x] added relevant user documentation - [x] added new CLI command: `unskip` ### Tests - [x] unit test added
## Changes `solacc` currently lints on a per-file basis, which is incorrect. This PR implements linting on a per-solution basis, thus improving dependency resolution. ### Linked issues None ### Functionality None ### Tests - [x] manually tested --------- Co-authored-by: Eric Vergnaud <eric.vergnaud@databricks.com>
Sync fork; `make fmt test` completed.