Make workspace client also return runtime dbutils when in dbr #210
Conversation
LGTM! I suppose this is as good a test as any for whether we are in DBR or not.
Suggestion: let's not include internal ticket IDs in PR descriptions in public repos.
@@ -17,6 +17,22 @@ from databricks.sdk.service.{{.Package.Name}} import {{.PascalName}}API{{end}}
{{- getOrDefault $mixins $genApi $genApi -}}
{{- end -}}

def _make_dbutils(config: client.Config):
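For context on the line being discussed, here is a minimal sketch of what such a factory could look like. This is not the PR's actual implementation: the `DATABRICKS_RUNTIME_VERSION` environment-variable check and the `runtime_dbutils` import are illustrative assumptions for detecting DBR; `RemoteDbUtils` and `Config` are the SDK types discussed elsewhere in this thread.

```python
import os

from databricks.sdk import dbutils as dbutils_mod
from databricks.sdk.core import Config


def make_dbutils(config: Config):
    """Prefer the runtime-provided dbutils on DBR, else fall back to RemoteDbUtils."""
    if "DATABRICKS_RUNTIME_VERSION" in os.environ:  # assumption: env var marks DBR
        try:
            # On DBR, databricks.sdk.runtime resolves to the runtime's own dbutils.
            from databricks.sdk.runtime import dbutils as runtime_dbutils
            return runtime_dbutils
        except ImportError:
            pass
    # Outside DBR (or if the runtime import fails), proxy calls over the REST API.
    return dbutils_mod.RemoteDbUtils(config)
```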
I'm not sure this is the intended behavior, as we want to migrate users away from dbutils to the SDK. This PR makes people assume that dbutils will stay there forever.
Agreed. But there is currently a huge functionality gap between the SDK implementation and dbutils on the runtime.
1. Having the two entry points (`from databricks.sdk.runtime import *` and `workspace_client.dbutils`) provide different implementations can be confusing for the user.
2. It also doesn't make sense to use the command-exec API (for the proxied dbutils) when we are already in the runtime. And there is functionality (such as widgets) which will necessarily work differently in the runtime and in the SDK, as illustrated below.

If we are OK with putting the responsibility on the user to understand which dbutils they are using (1), we can address (2) by selectively proxying to the DBR dbutils when in DBR (instead of using command-exec or different implementations for widgets).
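To make point 2 concrete, here is a hedged example of how the two environments can diverge for widgets. The widget name and default value are made up, and the local fallback behaviour is the one described for databricks#93 further down the page.

```python
from databricks.sdk.runtime import dbutils

# On DBR this creates a real notebook widget the user can edit; in a plain local
# Python session there is no notebook UI, so reads fall back to the default value.
dbutils.widgets.text("env", "dev")
print(dbutils.widgets.get("env"))  # "dev" locally; whatever the widget holds on DBR
```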
@xiaochen-db @falaki - what do you guys think?
Sounds like we are reversing from the former direction, where the REPL should eventually adopt the SDK dbutils despite the current functionality gaps, to maintaining two versions of dbutils: the SDK one and the DBR one, with the SDK one falling back on DBR if it's used in a REPL.
I am okay with that, given that the functionality gaps seem unresolvable in the near term. My other question is: why do you only update the dbutils from `WorkspaceClient`? Is there another way for a user to create `dbutils.RemoteDbUtils(config)`?
We expose dbutils through two paths:
1. `from databricks.sdk.runtime import *`
2. `workspace_client.dbutils`

The first one already falls back to the DBR version of dbutils; `WorkspaceClient` does not.
Good point about `dbutils.RemoteDbUtils(config)`. It is possible. It currently looks like this:

from databricks.sdk import dbutils
dbutils.RemoteDbUtils(config).fs.ls(".")

We should rename this to `_dbutils` to make it clear that this is an internal implementation and users are not expected to initialise it. Wdyt @nfx?
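For comparison, the user-facing path under discussion would look like this after the change. Assumptions: credentials come from environment variables or a config profile, and the listing path is arbitrary.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # auth picked up from env vars or ~/.databrickscfg

# The same call works in both places: inside DBR it goes through the runtime
# dbutils, everywhere else through RemoteDbUtils over the REST API.
for entry in w.dbutils.fs.ls("/"):
    print(entry.path)
```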
Can we also please type hint the return type of this correctly?
@judahrand this will already work with the changes you made. This will return a union of types, which will include the full typing information you added to `sdk.runtime`.
I meant type hinting in the code 🤷♂️ No reason not to properly type hint all Python code these days.
This function is still missing a return type annotation 😢
@judahrand it is difficult to type this function, because we cannot know which type of dbutils to use (the DBR one or the OSS one) until we actually try importing dbutils from DBR. So the return type can't be determined until the function runs.
We do not want to mess with DBR typing.
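One hedged way to annotate the return type anyway, without importing anything DBR-only at runtime, is a structural `Protocol`. The attribute list below is a made-up subset, and this is a sketch of the idea rather than how the SDK actually resolved this comment.

```python
from typing import Protocol


class DbUtilsLike(Protocol):
    """Structural stand-in for 'whichever dbutils the factory returns'."""

    # Both the runtime dbutils and RemoteDbUtils expose these handles (subset).
    fs: object
    secrets: object
    widgets: object


def make_dbutils(config) -> DbUtilsLike:
    ...  # concrete class (runtime dbutils vs. RemoteDbUtils) is only known at runtime
```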
…tabricks-sdk-py into fallback-dbutils-apiclient
Codecov Report
Patch coverage:

Additional details and impacted files

@@           Coverage Diff            @@
##             main     #210    +/-   ##
=========================================
  Coverage   53.23%   53.24%
=========================================
  Files          32       32
  Lines       19138    19145     +7
=========================================
+ Hits        10188    10193     +5
- Misses       8950     8952     +2

☔ View full report in Codecov by Sentry.
* Add Issue Templates ([#208](#208)).
* Fix error message, ExportFormat -> ImportFormat ([#220](#220)).
* Fixed notebook native auth for jobs ([#209](#209)).
* Make workspace client also return runtime dbutils when in dbr ([#210](#210)).
* Regenerate Python SDK using recent OpenAPI Specification ([#229](#229)).
* Replace `datatime.timedelta()` with `datetime.timedelta()` in codebase ([#207](#207)).
* Support dod in python sdk ([#212](#212)).
* Use .ConstantName defining target enum states for waiters ([#230](#230)).
* [DECO-1115] Add local implementation for `dbutils.widgets` ([#93](#93)).

API Changes:

* Added `update()` method for [w.tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/tables.html) workspace-level service.
* Added [w.lineage_tracking](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lineage_tracking.html) workspace-level service.
* Removed `comment`, `mask`, `nullable`, `partition_index`, `position`, `type_interval_type`, `type_json`, `type_name`, `type_precision`, `type_scale` and `type_text` fields for `databricks.sdk.service.catalog.ColumnInfo`.
* Added `catalog_name`, `has_permission`, `path`, `schema_name`, `table_name` and `table_type` fields for `databricks.sdk.service.catalog.ColumnInfo`.
* Removed `databricks.sdk.service.catalog.ColumnMask`, `DataSourceFormat`, `DeltaRuntimePropertiesKvPairs` and `TableConstraintList` dataclasses.
* Removed `columns`, `comment`, `created_at`, `created_by`, `data_access_configuration_id`, `data_source_format`, `deleted_at`, `delta_runtime_properties_kvpairs`, `effective_auto_maintenance_flag`, `enable_auto_maintenance`, `full_name`, `metastore_id`, `owner`, `properties`, `row_filter`, `sql_path`, `storage_credential_name`, `storage_location`, `table_constraints`, `table_id`, `updated_at`, `updated_by`, `view_definition` and `view_dependencies` fields for `databricks.sdk.service.catalog.TableInfo`.
* Added `has_permission` field for `databricks.sdk.service.catalog.TableInfo`.
* Removed `databricks.sdk.service.catalog.TableRowFilter` dataclass.
* Added `databricks.sdk.service.catalog.ColumnLineageRequest`, `DashboardInfo`, `FileInfo`, `GetColumnLineagesResponse`, `GetTableEntityLineageResponse`, `JobInfo`, `LineageInfo`, `NotebookInfo`, `PipelineInfo`, `QueryInfo`, `TableLineageRequest` and `UpdateTableRequest` dataclasses.
* Added `schema` field for `databricks.sdk.service.iam.PartialUpdate` and the `databricks.sdk.service.iam.PatchSchema` dataclass.
* Added `trigger_info` field for `databricks.sdk.service.jobs.BaseRun` and `Run`.
* Added `health` field for `databricks.sdk.service.jobs.CreateJob`, `JobSettings`, `SubmitRun`, `SubmitTask` and `Task`.
* Added `job_source` field for `databricks.sdk.service.jobs.GitSource`.
* Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`, `TaskEmailNotifications` and `WebhookNotifications`.
* Added `run_job_output` field for `databricks.sdk.service.jobs.RunOutput`.
* Added `run_job_task` field for `databricks.sdk.service.jobs.RunTask` and `Task`.
* Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitRun` and `SubmitTask`.
* Added `notification_settings` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `databricks.sdk.service.jobs.JobSource`, `JobSourceDirtyState`, `JobsHealthMetric`, `JobsHealthOperator`, `JobsHealthRule`, `JobsHealthRules`, `RunJobOutput`, `RunJobTask`, `TriggerInfo` and `WebhookNotificationsOnDurationWarningThresholdExceededItem` dataclasses.
* Removed `whl` field for `databricks.sdk.service.pipelines.PipelineLibrary`.
* Changed `delete_personal_compute_setting()` and `read_personal_compute_setting()` methods for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
* Changed `etag` field for `databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` and `ReadPersonalComputeSettingRequest` to be required.
* Added [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service.
* Added `databricks.sdk.service.sharing.CentralCleanRoomInfo`, `CleanRoomAssetInfo`, `CleanRoomCatalog`, `CleanRoomCatalogUpdate`, `CleanRoomCollaboratorInfo`, `CleanRoomInfo`, `CleanRoomNotebookInfo`, `CleanRoomTableInfo`, `ColumnInfo`, `ColumnMask`, `ColumnTypeName`, `CreateCleanRoom`, `DeleteCleanRoomRequest`, `GetCleanRoomRequest`, `ListCleanRoomsResponse` and `UpdateCleanRoom` dataclasses.
* Changed `query` field for `databricks.sdk.service.sql.Alert` to `databricks.sdk.service.sql.AlertQuery` dataclass.
* Changed `value` field for `databricks.sdk.service.sql.AlertOptions` to `any` dataclass.
* Removed `is_db_admin` and `profile_image_url` fields for `databricks.sdk.service.sql.User`.
* Added `databricks.sdk.service.sql.AlertQuery` dataclass.

OpenAPI SHA: 1f35e7f31d95b7dd0b08df53a22ec9dac18b1c84, Date: 2023-07-15
* Add Issue Templates ([#208](#208)).
* Added toolchain configuration to `.codegen.json` ([#236](#236)).
* Fix enum deserialization ([#234](#234)).
* Fix enum deserialization, take 2 ([#235](#235)).
* Fix error message, ExportFormat -> ImportFormat ([#220](#220)).
* Fixed notebook native auth for jobs ([#209](#209)).
* Make OpenAPI spec location configurable ([#237](#237)).
* Make workspace client also return runtime dbutils when in dbr ([#210](#210)).
* Regenerate Python SDK using recent OpenAPI Specification ([#229](#229)).
* Replace `datatime.timedelta()` with `datetime.timedelta()` in codebase ([#207](#207)).
* Support dod in python sdk ([#212](#212)).
* Use .ConstantName defining target enum states for waiters ([#230](#230)).
* [DECO-1115] Add local implementation for `dbutils.widgets` ([#93](#93)).

API Changes:

* Added `update()` method for [w.tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/tables.html) workspace-level service.
* Added `databricks.sdk.service.catalog.UpdateTableRequest` dataclass.
* Added `schema` field for `databricks.sdk.service.iam.PartialUpdate` and the `databricks.sdk.service.iam.PatchSchema` dataclass.
* Added `trigger_info` field for `databricks.sdk.service.jobs.BaseRun` and `Run`.
* Added `health` field for `databricks.sdk.service.jobs.CreateJob`, `JobSettings`, `SubmitRun`, `SubmitTask` and `Task`.
* Added `job_source` field for `databricks.sdk.service.jobs.GitSource`.
* Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`, `TaskEmailNotifications` and `WebhookNotifications`.
* Added `run_job_output` field for `databricks.sdk.service.jobs.RunOutput`.
* Added `run_job_task` field for `databricks.sdk.service.jobs.RunTask` and `Task`.
* Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitRun` and `SubmitTask`.
* Added `notification_settings` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `databricks.sdk.service.jobs.JobSource`, `JobSourceDirtyState`, `JobsHealthMetric`, `JobsHealthOperator`, `JobsHealthRule`, `JobsHealthRules`, `RunJobOutput`, `RunJobTask`, `TriggerInfo` and `WebhookNotificationsOnDurationWarningThresholdExceededItem` dataclasses.
* Removed `whl` field for `databricks.sdk.service.pipelines.PipelineLibrary`.
* Changed `delete_personal_compute_setting()` and `read_personal_compute_setting()` methods for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
* Changed `etag` field for `databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` and `ReadPersonalComputeSettingRequest` to be required.
* Added [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service.
* Added `databricks.sdk.service.sharing.CentralCleanRoomInfo`, `CleanRoomAssetInfo`, `CleanRoomCatalog`, `CleanRoomCatalogUpdate`, `CleanRoomCollaboratorInfo`, `CleanRoomInfo`, `CleanRoomNotebookInfo`, `CleanRoomTableInfo`, `ColumnInfo`, `ColumnMask`, `ColumnTypeName`, `CreateCleanRoom`, `DeleteCleanRoomRequest`, `GetCleanRoomRequest`, `ListCleanRoomsResponse` and `UpdateCleanRoom` dataclasses.
* Changed `query` field for `databricks.sdk.service.sql.Alert` to `databricks.sdk.service.sql.AlertQuery` dataclass.
* Changed `value` field for `databricks.sdk.service.sql.AlertOptions` to `any` dataclass.
* Removed `is_db_admin` and `profile_image_url` fields for `databricks.sdk.service.sql.User`.
* Added `databricks.sdk.service.sql.AlertQuery` dataclass.

OpenAPI SHA: e20d2b10a181b1e865716de25f42e86d7e3f0270, Date: 2023-07-17
Make workspace client also return runtime dbutils when in dbr (databricks#210)

## Changes
* `workspace_client.dbutils` always returned the OSS implementation of dbutils. We want it to also use the DBR implementation when in DBR.

## Tests
* [x] Manually test in DBR
* [x] Test locally
- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
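A hedged local sanity check for the behaviour described above; the host and token are placeholders, and the import path for `RemoteDbUtils` is taken from the review discussion.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.dbutils import RemoteDbUtils

w = WorkspaceClient(host="https://example.cloud.databricks.com", token="dapi-placeholder")

# Outside DBR there is no runtime dbutils to return, so the workspace client
# should still hand back the REST-backed OSS implementation.
assert isinstance(w.dbutils, RemoteDbUtils)
```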
[DECO-1115] Add local implementation for `dbutils.widgets` (databricks#93)
* Added a new install group (`pip install 'databricks-sdk[notebook]'`). This allows us to safely pin ipywidgets for local installs. DBR can safely continue using `pip install databricks-sdk` or directly using the default build from master without conflicting dependencies.
* OSS implementation of widgets is imported only on first use (possible only through the OSS implementation of dbutils, `RemoteDbutils`).
* Add a wrapper for ipywidgets to enable interactive widgets when in interactive **IPython** notebooks. https://user-images.githubusercontent.com/88345179/236443693-1c804107-ba21-4296-ba40-2b1e8e062d16.mov
* Add default widgets implementation that returns a default value when not in an interactive environment. https://user-images.githubusercontent.com/88345179/236443729-51185404-4d28-49c6-ade0-a665e154e092.mov
- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

Fix error message, ExportFormat -> ImportFormat (databricks#220)
The proper argument is ImportFormat.AUTO, not ExportFormat.AUTO. Correct the error message when `ImportFormat` is not provided to `workspace.upload`.
Signed-off-by: Jessica Smith <8505845+NodeJSmith@users.noreply.github.com>

Regenerate Python SDK using recent OpenAPI Specification (databricks#229)
Spec commit sha: 17a3f7fe6 (7 July 2023)
Breaking Changes:
* Use CONSTANT_CASE for Enum constants. Many enums already use constant case in their definition, but some constants (like the SCIM Patch schema name) include symbols `:` and numbers, so the SDK cannot use the enum value as the name.
* Replace Query type with AlertQuery in sql.Alert class.
* Removal of User.is_db_admin and User.profile_image_url.
Changes:
* Introduce CleanRooms API
* Introduce TablesAPI.update()
* Introduce Group.meta property
* Fix SCIM Patch implementation
* Introduce BaseRun.job_parameters and BaseRun.trigger_info
* Introduce CreateJob.parameters
* Fix spelling in file arrival trigger configuration
* Introduce GitSource.job_source
* Introduce RepairRun.rerun_dependent_tasks
* Introduce Resolved*Values classes, RunIf, and RunJobTask
* Introduce TaskNotificationSettings
Later follow-up:
* Names should split on Pascal-case word boundaries (see CloudProviderNodeStatus). This is an OpenAPI code gen change that needs to be made.

Make workspace client also return runtime dbutils when in dbr (databricks#210)
* `workspace_client.dbutils` always returned the OSS implementation of dbutils. We want it to also use the DBR implementation when in DBR.
* [x] Manually test in DBR
* [x] Test locally
- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

Use .ConstantName defining target enum states for waiters (databricks#230)
Uses of enums in generated code need to be updated to use `{{.ConstantName}}` instead of `{{.Content}}`.
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Fix enum deserialization (databricks#234)
In databricks#230, enums were changed so that enum field names did not necessarily match the enum value itself. However, the `_enum` helper method used during deserialization of a response containing an enum was not updated to handle this case. This PR corrects this method to check through the values of the `__members__` of an enum, as opposed to the keys.
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Fix enum deserialization, take 2 (databricks#235)
We jumped the gun too quickly on databricks#234. This is the actual change which fixes the integration tests.
- [x] The two failing integration tests (test_submitting_jobs and test_proxy_dbfs_mounts) both pass on this PR.

Added toolchain configuration to `.codegen.json` (databricks#236)
- Added toolchain config for automated releases
- Added `CHANGELOG.md` template with OpenAPI SHA prep release changes

Make OpenAPI spec location configurable (databricks#237)
Introducing the `DATABRICKS_OPENAPI_SPEC` environment variable to hold a filesystem location of the `all-internal.json` spec.

Rearrange imports in `databricks.sdk.runtime` to improve local editor experience (databricks#219)
The type hinting here means that VSCode does not give useful syntax highlights / code completions. This is the current experience on `main`:
https://github.com/databricks/databricks-sdk-py/assets/17158624/72d2c3eb-cc3a-4f95-9f09-7d43a8f2815e
https://github.com/databricks/databricks-sdk-py/assets/17158624/34b04de0-5996-4a2e-a0d2-b26a8c9d3da9
With these changes this becomes:
https://github.com/databricks/databricks-sdk-py/assets/17158624/99a91d82-f06a-4883-b131-7f96a82edd80
https://github.com/databricks/databricks-sdk-py/assets/17158624/ce684fd0-f550-4afe-bc46-1187b6dd4b49
- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
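The enum fix in databricks#234/#235 is easier to see with a toy enum. The helper name `_enum` comes from the commit message above; its signature here is simplified for illustration and is not the SDK's exact internal API.

```python
# After databricks#230, an enum constant's *name* (CONSTANT_CASE) may no longer
# equal its *value*, so deserialization must compare member values, not keys.
from enum import Enum
from typing import Optional, Type, TypeVar

E = TypeVar("E", bound=Enum)


def _enum(raw: Optional[str], enum_cls: Type[E]) -> Optional[E]:
    if raw is None:
        return None
    for member in enum_cls.__members__.values():
        if member.value == raw:
            return member
    return None  # tolerate unknown values instead of raising


class PatchOp(Enum):  # toy enum standing in for a generated SDK enum
    ADD = "add"
    SCIM_PATCH = "urn:ietf:params:scim:api:messages:2.0:PatchOp"


assert _enum("urn:ietf:params:scim:api:messages:2.0:PatchOp", PatchOp) is PatchOp.SCIM_PATCH
```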
* Add Issue Templates ([#208](#208)).
* Fixed notebook native auth for jobs ([#209](#209)).
* Replace `datatime.timedelta()` with `datetime.timedelta()` in codebase ([#207](#207)).
* Support dod in python sdk ([#212](#212)).
* [DECO-1115] Add local implementation for `dbutils.widgets` ([#93](#93)).
* Fix error message, ExportFormat -> ImportFormat ([#220](#220)).
* Regenerate Python SDK using recent OpenAPI Specification ([#229](#229)).
* Make workspace client also return runtime dbutils when in dbr ([#210](#210)).
* Use .ConstantName defining target enum states for waiters ([#230](#230)).
* Fix enum deserialization ([#234](#234)).
* Fix enum deserialization, take 2 ([#235](#235)).
* Added toolchain configuration to `.codegen.json` ([#236](#236)).
* Make OpenAPI spec location configurable ([#237](#237)).
* Rearrange imports in `databricks.sdk.runtime` to improve local editor experience ([#219](#219)).
* Updated account-level and workspace-level user management examples ([#241](#241)).

API Changes:

* Removed `maintenance()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service.
* Added `enable_optimization()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service.
* Added `update()` method for [w.tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/tables.html) workspace-level service.
* Added `force` field for `databricks.sdk.service.catalog.DeleteAccountMetastoreRequest`.
* Added `force` field for `databricks.sdk.service.catalog.DeleteAccountStorageCredentialRequest`.
* Removed `databricks.sdk.service.catalog.UpdateAutoMaintenance` dataclass.
* Removed `databricks.sdk.service.catalog.UpdateAutoMaintenanceResponse` dataclass.
* Added `databricks.sdk.service.catalog.UpdatePredictiveOptimization` dataclass.
* Added `databricks.sdk.service.catalog.UpdatePredictiveOptimizationResponse` dataclass.
* Added `databricks.sdk.service.catalog.UpdateTableRequest` dataclass.
* Added `schema` field for `databricks.sdk.service.iam.PartialUpdate`.
* Added `databricks.sdk.service.iam.PatchSchema` dataclass.
* Added `trigger_info` field for `databricks.sdk.service.jobs.BaseRun`.
* Added `health` field for `databricks.sdk.service.jobs.CreateJob`.
* Added `job_source` field for `databricks.sdk.service.jobs.GitSource`.
* Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`.
* Added `health` field for `databricks.sdk.service.jobs.JobSettings`.
* Added `trigger_info` field for `databricks.sdk.service.jobs.Run`.
* Added `run_job_output` field for `databricks.sdk.service.jobs.RunOutput`.
* Added `run_job_task` field for `databricks.sdk.service.jobs.RunTask`.
* Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitRun`.
* Added `health` field for `databricks.sdk.service.jobs.SubmitRun`.
* Added `email_notifications` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `health` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `notification_settings` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `health` field for `databricks.sdk.service.jobs.Task`.
* Added `run_job_task` field for `databricks.sdk.service.jobs.Task`.
* Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.TaskEmailNotifications`.
* Added `on_duration_warning_threshold_exceeded` field for `databricks.sdk.service.jobs.WebhookNotifications`.
* Added `databricks.sdk.service.jobs.JobSource` dataclass.
* Added `databricks.sdk.service.jobs.JobSourceDirtyState` dataclass.
* Added `databricks.sdk.service.jobs.JobsHealthMetric` dataclass.
* Added `databricks.sdk.service.jobs.JobsHealthOperator` dataclass.
* Added `databricks.sdk.service.jobs.JobsHealthRule` dataclass.
* Added `databricks.sdk.service.jobs.JobsHealthRules` dataclass.
* Added `databricks.sdk.service.jobs.RunJobOutput` dataclass.
* Added `databricks.sdk.service.jobs.RunJobTask` dataclass.
* Added `databricks.sdk.service.jobs.TriggerInfo` dataclass.
* Added `databricks.sdk.service.jobs.WebhookNotificationsOnDurationWarningThresholdExceededItem` dataclass.
* Removed `whl` field for `databricks.sdk.service.pipelines.PipelineLibrary`.
* Changed `delete_personal_compute_setting()` method for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
* Changed `read_personal_compute_setting()` method for [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html) account-level service with new required argument order.
* Changed `etag` field for `databricks.sdk.service.settings.DeletePersonalComputeSettingRequest` to be required.
* Changed `etag` field for `databricks.sdk.service.settings.ReadPersonalComputeSettingRequest` to be required.
* Added [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service.
* Added `databricks.sdk.service.sharing.CentralCleanRoomInfo` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomAssetInfo` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomCatalog` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomCatalogUpdate` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomCollaboratorInfo` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomInfo` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomNotebookInfo` dataclass.
* Added `databricks.sdk.service.sharing.CleanRoomTableInfo` dataclass.
* Added `databricks.sdk.service.sharing.ColumnInfo` dataclass.
* Added `databricks.sdk.service.sharing.ColumnMask` dataclass.
* Added `databricks.sdk.service.sharing.ColumnTypeName` dataclass.
* Added `databricks.sdk.service.sharing.CreateCleanRoom` dataclass.
* Added `databricks.sdk.service.sharing.DeleteCleanRoomRequest` dataclass.
* Added `databricks.sdk.service.sharing.GetCleanRoomRequest` dataclass.
* Added `databricks.sdk.service.sharing.ListCleanRoomsResponse` dataclass.
* Added `databricks.sdk.service.sharing.UpdateCleanRoom` dataclass.
* Changed `query` field for `databricks.sdk.service.sql.Alert` to `databricks.sdk.service.sql.AlertQuery` dataclass.
* Changed `value` field for `databricks.sdk.service.sql.AlertOptions` to `any` dataclass.
* Removed `is_db_admin` field for `databricks.sdk.service.sql.User`.
* Removed `profile_image_url` field for `databricks.sdk.service.sql.User`.
* Added `databricks.sdk.service.sql.AlertQuery` dataclass.

OpenAPI SHA: 0a1949ba96f71680dad30e06973eaae85b1307bb, Date: 2023-07-18

[DECO-1115]: https://databricks.atlassian.net/browse/DECO-1115?atlOrigin=eyJpIjoiNWRkNTljNzYxNjVmNDY3MDlhMDU5Y2ZhYzA5YTRkZjUiLCJwIjoiZ2l0aHViLWNvbS1KU1cifQ
Changes

`workspace_client.dbutils` always returned the OSS implementation of dbutils. We want it to also use the DBR implementation when running in DBR. (A usage sketch follows the checklist below.)

Tests

- [x] `make test` run locally
- [x] `make fmt` applied
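For illustration of the end-user effect (credential resolution and the listed path are placeholders, not part of this PR): the same `dbutils` handle now works whether the code runs locally or on DBR.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # resolves local config, or runtime credentials on DBR

# Locally, w.dbutils proxies calls over the REST API via RemoteDbUtils; on DBR
# (after this change) the same attribute returns the runtime-provided dbutils.
for entry in w.dbutils.fs.ls("/"):
    print(entry.path)
```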