Release v0.10.0 #274

Merged
nfx merged 1 commit into main from prepare/0.10.0 on Sep 11, 2024
Conversation

@nfx (Collaborator) commented on Sep 11, 2024

* Added functionality to export any dashboards-as-code into CSV ([#269](#269)). The `DashboardMetadata` class now includes a new method, `export_to_zipped_csv`, which exports a dashboard's queries as CSV files bundled into a ZIP archive. The method accepts `sql_backend` and `export_path` as parameters, iterates through the dashboard's tiles, and fetches the query results for each tile that is a query. The feature is covered by unit tests and manual testing, including a new `test_dashboards_export_to_zipped_csv` test that verifies the exported CSV data; a usage sketch follows this list.
* Added support for generic types in `SqlBackend` ([#272](#272)). Rich dataclasses, including those with optional and generic types, can now be saved through the `StatementExecutionBackend` implementation of `SqlBackend`. The `save_table` method now handles the conversion of such dataclasses to SQL statements, backed by a new `StructInference` class that maps Python dataclasses and built-in types to their SQL Data Definition Language (DDL) representations. The behaviour is demonstrated by the `test_supports_complex_types` unit test, which saves a `Nested` dataclass containing nested dataclasses, `datetime` objects, `dict`, `list`, and optional fields; the second sketch below illustrates the idea.
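
A hedged usage sketch of the new export method; the warehouse ID and the dashboard/export paths below are placeholders for illustration, not values from this PR:

```python
# Minimal sketch: export a dashboards-as-code folder to a zipped CSV archive.
from pathlib import Path

from databricks.sdk import WorkspaceClient
from databricks.labs.lsql.backends import StatementExecutionBackend
from databricks.labs.lsql.dashboards import DashboardMetadata

ws = WorkspaceClient()  # resolves credentials from the environment
sql_backend = StatementExecutionBackend(ws, "<warehouse-id>")

# Load the dashboard definition, then write one CSV per query tile into
# a single ZIP archive at the given path.
dashboard = DashboardMetadata.from_path(Path("dashboards/sales"))
dashboard.export_to_zipped_csv(sql_backend, Path("exports/sales.zip"))
```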
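
And a sketch of the rich-dataclass support. The field shapes mirror the `STRUCT`/`MAP`/`ARRAY` DDL visible in the CI log further down; the table name and the optional `note` field are illustrative assumptions:

```python
# Illustrative sketch: save rows of a nested dataclass through save_table.
import datetime
from dataclasses import dataclass

from databricks.sdk import WorkspaceClient
from databricks.labs.lsql.backends import StatementExecutionBackend


@dataclass
class Foo:
    first: str
    second: bool


@dataclass
class Nested:
    foo: Foo                     # STRUCT<first:STRING,second:BOOLEAN>
    since: datetime.date         # DATE
    created: datetime.datetime   # TIMESTAMP
    mapping: dict[str, int]      # MAP<STRING,LONG>
    array: list[int]             # ARRAY<LONG>
    note: str | None = None      # optional fields become nullable columns


ws = WorkspaceClient()
backend = StatementExecutionBackend(ws, "<warehouse-id>")

# save_table infers the DDL from the dataclass, creates the table if
# needed, and appends the rows.
rows = [
    Nested(
        foo=Foo("a", True),
        since=datetime.date(2024, 9, 11),
        created=datetime.datetime(2024, 9, 11, 16, 32),
        mapping={"key": 1},
        array=[1, 2, 3],
    )
]
backend.save_table("hive_metastore.some_schema.nested_rows", rows, Nested)
```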
@nfx nfx merged commit 82826ab into main Sep 11, 2024
6 of 8 checks passed
@nfx nfx deleted the prepare/0.10.0 branch September 11, 2024 16:31

❌ 34/35 passed, 1 failed, 3 skipped, 8m24s total

❌ test_appends_complex_types: databricks.sdk.errors.platform.BadRequest: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: (672ms)
databricks.sdk.errors.platform.BadRequest: [INSUFFICIENT_PERMISSIONS] Insufficient privileges:
User does not have permission CREATE,USAGE on database `TEST_SCHEMA`. SQLSTATE: 42501
16:32 DEBUG [databricks.sdk] Loaded from environment
16:32 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
16:32 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
16:32 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
16:32 INFO [databricks.sdk] Using Databricks Metadata Service authentication
16:32 DEBUG [databricks.labs.lsql.backends] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.TEST_SCHEMA.taHyf (foo STRUCT<first:STRING,second:BOOLEAN>... (134 more bytes)
16:32 DEBUG [databricks.labs.lsql.core] Executing SQL statement: CREATE TABLE IF NOT EXISTS hive_metastore.TEST_SCHEMA.taHyf (foo STRUCT<first:STRING,second:BOOLEAN> NOT NULL, since DATE NOT NULL, created TIMESTAMP NOT NULL, mapping MAP<STRING,LONG> NOT NULL, array ARRAY<LONG> NOT NULL) USING DELTA
16:32 DEBUG [databricks.sdk] POST /api/2.0/sql/statements/
> {
>   "format": "JSON_ARRAY",
>   "statement": "CREATE TABLE IF NOT EXISTS hive_metastore.TEST_SCHEMA.taHyf (foo STRUCT<first:STRING,second:BOOLEAN>... (134 more bytes)",
>   "warehouse_id": "TEST_DEFAULT_WAREHOUSE_ID"
> }
< 200 OK
< {
<   "statement_id": "01ef705b-6911-1d57-b10f-330e5df34ab7",
<   "status": {
<     "error": {
<       "error_code": "BAD_REQUEST",
<       "message": "[INSUFFICIENT_PERMISSIONS] Insufficient privileges:\nUser does not have permission CREATE,USAGE o... (37 more bytes)"
<     },
<     "state": "FAILED"
<   }
< }
[gw3] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python

Running from acceptance #384
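
The failing check looks environmental rather than a regression in this PR: the CI principal lacks `CREATE` and `USAGE` on the `TEST_SCHEMA` database in `hive_metastore`. A hedged sketch of the grant an administrator could run to unblock the test, assuming legacy table ACLs; the principal name is a placeholder:

```python
# Sketch only: grant the missing schema privileges to the CI principal.
# The exact GRANT syntax depends on the workspace's ACL model, and the
# principal below is made up.
from databricks.sdk import WorkspaceClient
from databricks.labs.lsql.backends import StatementExecutionBackend

ws = WorkspaceClient()
backend = StatementExecutionBackend(ws, "<warehouse-id>")
backend.execute(
    "GRANT USAGE, CREATE ON SCHEMA hive_metastore.TEST_SCHEMA "
    "TO `ci-service-principal@example.com`"
)
```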
